jzon
jsoniter
| | jzon | jsoniter |
|---|---|---|
| Mentions | 8 | 12 |
| Stars | 133 | 13,010 |
| Growth | - | 1.0% |
| Activity | 7.2 | 0.0 |
| Latest commit | 2 days ago | 13 days ago |
| Language | Common Lisp | Go |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
jzon
-
Common Lisp JSON parser?
jzon (https://github.com/Zulu-Inuoe/jzon/) is the newest and probably the most complete, robust, and accurate. Its readme explains everything. I have settled on Shasht so far.
-
How to create a post body for dexador
I think the consensus now for JSON libraries (it's a meme that there are way too many CL JSON libraries) is to use jzon (https://github.com/Zulu-Inuoe/jzon). It's the best one I've found.
-
SBCL Help wanted: capturing big stdout (100M) and json parsing
I use JZON for SAX-style parsing; it works very well. If you can arrange to read your input as a stream, you shouldn't have memory problems with the reading/parsing part of your project.
- JZON hits 1.0 and is at last on the latest QL release: a correct and safe JSON parser, packed with features, and also FASTER than the latest JSON library advertised here.
-
What was your favorite Common Lisp release (implementation, library, tool, ...) in 2021?
jzon, the one JSON parser to rule them all
The readme at https://github.com/Zulu-Inuoe/jzon looks like stringify with :stream supports writing to stream - or do you mean something else?
jsoniter
-
Handling high-traffic HTTP requests with JSON payloads
Since most of the time would be spent decoding JSON, you could try to cut this time using https://github.com/bytedance/sonic or https://github.com/json-iterator/go. Both are drop-in replacements for the stdlib; sonic is faster.
-
A Journey building a fast JSON parser and full JSONPath
We all know the builtin golang JSON parser is slow.
How about doing comparisons against other implementations?
Like this one: https://github.com/json-iterator/go
-
Polygon: Json Database System designed to run on small servers (as low as 16MB) and still be fast and flexible.
With json-iterator (https://github.com/json-iterator/go) you can replace all of encoding/json. It does the same thing, but faster.
-
How can we unmarshal a big JSON effectively?
Do you want to look at every field all at the same time? If not, you can pick out individual fields. There are other packages, such as https://github.com/tidwall/gjson or https://github.com/json-iterator/go, that let you pass in paths such as "a.b.c" to extract single fields.
-
Designing a config API for microservices applications built using Go
For each Go type used within the config, we generate a separate unmarshaller function. The unmarshallers use json-iterator to process the output from CUE, while tracking the path within the config to the unmarshalled value. This path tracking will allow the function to check if live overrides have been provided on that path and return the override instead.
-
What type of software do you write at your workplace?
https://github.com/json-iterator/go is an alternative JSON encoding package which can stream (flush out) encoded data as soon as it is able to, in contrast with the stock package, which buffers everything until the encoding is known to be complete and OK.
-
Some Go(lang) tips
What to use? Easyjson is about the top of the pack, and it's straightforward. The downside of efficient tools is that they use code generation to create the code required to turn your structs into JSON and minimise allocations; this is a manual build step, which is annoying. Interestingly, json-iterator also uses reflection, yet it's significantly faster. I suspect black magic.
-
What are your favorite packages to use?
jsoniter for low level access to JSON encode and decode
-
What is the best solution to unique data in golang
I think you have to parse the JSON if you don't know exactly what you are looking for and want some validation to prevent manual parsing errors. For parsing big JSON files it is recommended to read and decode them as a stream. Here is an example. If you have serious performance criteria, take a look at jsoniter. It can be used as a 1-to-1 replacement for the standard library.
Takes like 10 minutes to write and parses very efficiently. https://github.com/json-iterator/go looks like it can provide such simple parsing
What are some alternatives?
go-json - Fast JSON encoder/decoder compatible with encoding/json for Go
mapstructure - Go library for decoding generic map values into native Go structures and vice versa.
easyjson - Fast JSON serializer for golang.
goprotobuf - Go support for Google's protocol buffers
compare-go-json - A comparison of several go JSON packages.
GJSON - Get JSON values quickly - JSON parser for Go
go-codec - idiomatic codec and rpc lib for msgpack, cbor, json, etc. msgpack.org[Go]
go-serializer - Serialize any custom type or convert any content to []byte or string, for Go Programming Language
Gin - Gin is a HTTP web framework written in Go (Golang). It features a Martini-like API with much better performance -- up to 40 times faster. If you need smashing performance, get yourself some Gin.
colfer - binary serialization format
spew - Implements a deep pretty printer for Go data structures to aid in debugging
go-sqlite3 - sqlite3 driver for go using database/sql