Speeding up Go's builtin JSON encoder up to 55% for large arrays of objects

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • encoding

    Go package containing implementations of efficient encoding, decoding, and validation APIs.

  • Would love to see results from incorporating https://github.com/segmentio/encoding/tree/master/json!

  • ndjson.github.io

    Info Website for NDJSON

  • I think this would be fine, as long as the CSV layer remains parseable per RFC 4180: you could then use a normal CSV parser for the CSV layer and a normal JSON parser for the JSON layer. My worry with your example is that it is neither format, so it would need custom serialisation and deserialisation logic, as it is essentially a brand-new format.

    https://datatracker.ietf.org/doc/html/rfc4180

    If you’re looking for line-oriented JSON, another option would be ndjson: http://ndjson.org/

NOTE: The number of mentions on this list counts mentions in common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.
