In other languages, you generally wouldn't want to load super-large JSON files into memory all at once: it eats a crapton of heap space, often creates far too many unnecessary objects, and so on. Instead you'd usually use a streaming parser of some kind, essentially discarding parts of the document after you've processed them. For example (sorry for the dirty PHP link here, but it illustrates the concept): https://github.com/salsify/jsonstreamingparser/
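To make the pattern concrete in Common Lisp: a streaming parser hands you a sequence of small events instead of one big object tree, and you decide what to keep. Here's a minimal sketch using the jzon library that comes up later in this thread; the `with-parser`/`parse-next` names and the event kinds are taken from jzon's README, so double-check them against the version you have installed.

```lisp
;; Sketch: walk a JSON document event by event instead of building
;; the whole tree in memory. WITH-PARSER / PARSE-NEXT are jzon's
;; documented streaming-reader entry points (per its README).
(defpackage #:json-events-example
  (:use #:cl)
  (:local-nicknames (#:jzon #:com.inuoe.jzon)))
(in-package #:json-events-example)

(defun print-json-events (source)
  "Print each parse event as it arrives. SOURCE can be a string
or an open character stream."
  (jzon:with-parser (parser source)
    (loop
      (multiple-value-bind (event value) (jzon:parse-next parser)
        (unless event (return))          ; NIL means end of document
        (format t "~S~@[ ~S~]~%" event value)))))

;; (print-json-events "{\"name\": \"jzon\", \"tags\": [1, 2]}")
;; => :BEGIN-OBJECT
;;    :OBJECT-KEY "name"
;;    :VALUE "jzon"
;;    :OBJECT-KEY "tags"
;;    :BEGIN-ARRAY
;;    :VALUE 1
;;    :VALUE 2
;;    :END-ARRAY
;;    :END-OBJECT
```

Nothing here accumulates between events, which is exactly why this style works on files far bigger than your heap.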
As /u/kryptonik writes, if the JSON is really large, slurping all of it into memory is probably a bad idea. Yason doesn't offer streaming parsing, but at least two other libraries do: cl-json and json-streams.
I use JZON for SAX-style parsing; it works very well. If you can arrange to read your input as a stream, you shouldn't have memory problems with the reading/parsing part of your project.
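As a sketch of what that can look like in practice (again assuming jzon's documented `with-parser`/`parse-next` streaming API and its event kinds, and that `parse-next` returns NIL at end of input): to process a huge top-level JSON array one element at a time, you can track nesting depth and act on each element as it completes, so memory use stays flat regardless of file size.

```lisp
;; Sketch: count the elements of a huge top-level JSON array without
;; ever holding the array in memory. Only the depth counter and the
;; running count survive between events.
(defpackage #:json-array-example
  (:use #:cl)
  (:local-nicknames (#:jzon #:com.inuoe.jzon)))
(in-package #:json-array-example)

(defun count-array-elements (pathname)
  "Return the number of elements in the top-level array of the JSON
file at PATHNAME, reading it as a stream of parse events."
  (with-open-file (in pathname :external-format :utf-8)
    (jzon:with-parser (parser in)
      (let ((depth 0)
            (count 0))
        (loop
          (multiple-value-bind (event value) (jzon:parse-next parser)
            (declare (ignore value))
            (case event
              ((nil) (return count))            ; end of document
              ((:begin-array :begin-object)
               ;; a container opening at depth 1 is one array element
               (when (= depth 1) (incf count))
               (incf depth))
              ((:end-array :end-object) (decf depth))
              (:value
               ;; a scalar at depth 1 is one array element
               (when (= depth 1) (incf count))))))))))
```

Swapping the `incf count` for real per-element work gives you the usual streaming shape: handle one element, drop it, move on.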
Related posts
- How to create a post body for dexador
- JZON hits 1.0 and is at last on the latest QL release: a correct and safe JSON parser, packed with features, and also FASTER than the latest JSON library advertised here.
- jzon - a correct and safe JSON parser.
- Common Lisp JSON parser?
- Command Line Args for CLisp (specifically for replit)