SBCL Help wanted: capturing big stdout (100M) and json parsing

This page summarizes the projects mentioned and recommended in the original post on /r/lisp.

  • jsonstreamingparser

    A JSON streaming parser implementation in PHP.

  • In other languages, you generally wouldn't want to load super large JSON files into memory all at once, as it uses a crapton of heap space, often creates far too many unnecessary objects, etc. Often you'd use a streaming parser of some kind, essentially discarding parts of the doc after you've processed them. For example (sorry for the dirty PHP link here, but it illustrates the concept; a Common Lisp sketch of the same idea follows this list): https://github.com/salsify/jsonstreamingparser/

  • json-streams

    Common Lisp library for reading and writing JSON.

  • As /u/kryptonik writes, if the JSON is really large, slurping all of it into memory is probably a bad idea. Yason doesn't support streaming parsing, but there are at least two other libraries that do: cl-json and json-streams.

  • jzon

    A correct and safe(er) JSON RFC 8259 reader/writer with sane defaults.

  • I use jzon for SAX-style parsing; it works very well. If you can arrange to read your input as a stream, you shouldn't have memory problems with the reading/parsing part of your project. (See the sketches after this list.)
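
Here is roughly what that SAX-style approach looks like. A minimal sketch, not a tested program: jzon:with-parser / jzon:parse-next and the event keywords (:begin-object, :object-key, :value, :end-object, ...) are as described in the jzon README (double-check them against the current docs), and the "sum every n field" task is invented purely for illustration.

    ;; Assumes (ql:quickload "com.inuoe.jzon") and a nickname, e.g.
    ;; (uiop:add-package-local-nickname '#:jzon '#:com.inuoe.jzon)
    ;; Event-based parsing: the input is consumed one event at a time, so
    ;; the full document never has to exist in memory as one object tree.
    (defun sum-of-n-fields (source)
      "Sum every number that directly follows an \"n\" key in SOURCE.
    SOURCE may be a string, stream, or pathname. Illustrative only."
      (let ((sum 0)
            (after-n nil))
        (jzon:with-parser (parser source)
          (loop
            (multiple-value-bind (event value) (jzon:parse-next parser)
              (case event
                ((nil) (return sum))          ; no more events: done
                (:object-key (setf after-n (equal value "n")))
                (:value (when (and after-n (numberp value))
                          (incf sum value))
                        (setf after-n nil))
                (t (setf after-n nil))))))))

    ;; (sum-of-n-fields "[{\"n\": 1}, {\"n\": 2}, {\"m\": 99}]") => 3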
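
As for the other half of the title, the 100M of stdout: the usual move in SBCL is to never capture it as one big string at all. Launch the program with uiop:launch-program and :output :stream, then hand the child's stdout stream straight to the streaming parser. Another sketch, assuming jzon's with-parser accepts a stream source (jzon:parse does); "some-command" is a placeholder for whatever actually produces the JSON, and sum-of-n-fields is the function from the previous sketch.

    ;; Stream the subprocess's stdout directly into the parser instead of
    ;; slurping ~100M into a string first. uiop:launch-program with
    ;; :output :stream exposes the child's stdout as a Lisp stream.
    (defun sum-from-command ()
      (let ((proc (uiop:launch-program '("some-command" "--json")
                                       :output :stream)))
        (unwind-protect
             (sum-of-n-fields (uiop:process-info-output proc))
          (uiop:close-streams proc)   ; close the pipe
          (uiop:wait-process proc)))) ; reap the child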

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.
