| | daw_json_link_describe | nativejson-benchmark |
|---|---|---|
| Mentions | 2 | 10 |
| Stars | 4 | 1,926 |
| Growth | - | - |
| Activity | 1.8 | 0.0 |
| Latest commit | almost 2 years ago | over 1 year ago |
| Language | C++ | JavaScript |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
daw_json_link_describe
-
Show HN: DAW JSON Link
An accompanying project https://github.com/beached/daw_json_link_describe that allows using Boost.Describe reflection like mappings with JSON Link
-
DAW JSON Link v3, a JSON serialization/deserialization library, is released
In this regard, I really like how daw_json_link works: it directly converts the JSON string into/from your custom C++ data type, without needing an intermediate DOM representation (though you can use one if you really want). The interface for defining the JSON serialization/deserialization for your data type is non-intrusive, which is another plus. Due to the lack of reflection in C++ it still requires some boilerplate, but personally I think it is more convenient than the other options I tried. Furthermore, it is possible to reduce this boilerplate to a minimum with the aid of Boost.Describe.
nativejson-benchmark
-
Training great LLMs from ground zero in the wilderness as a startup
Well, it would depend on the specifics of the JSON file, but eyeballing the stats at https://github.com/miloyip/nativejson-benchmark/tree/master suggests that even on a 2015 MacBook, parsing with e.g. the Configuru parser proceeds at several megabytes per second.
-
What C++ library do you wish existed but hasn’t been created yet?
-
How can I quickly parse a huge 45MB JSON file using JsonDecoder
Maybe you need to try some other third party json library and see if it helps. This is a good list https://github.com/miloyip/nativejson-benchmark
-
Why is Mastodon so slow?
Glancing at some benchmarks, RapidJSON stringifies at around 250MB/s on a single core (content-dependent, of course). Does not look like a bottleneck.
-
Show HN: DAW JSON Link
How does it compare to the immensely popular JSON for Modern C++ library by nlohmann? https://github.com/nlohmann/json
Also, you should add your library to the JSON benchmarks here: https://github.com/miloyip/nativejson-benchmark#parsing-time
-
Debunking Cloudflare’s recent performance tests
I like your ideas, but they seem difficult to enforce. It assumes good faith on all sides. One of the biggest complaints about AI/ML research results: It is frequently hard/impossible to replicate the results.
One idea: the edge competitors could create a public (SourceHut?) project that runs various daily tests against themselves, similar to the JSON library benchmarks. [1] Then allow each competitor to continuously tweak their settings to accomplish the task in the shortest amount of time.
Also: It would be nice to see a cost analysis. For years, IBM's DB2 was insanely fast if you could afford to pay outrageous hardware, software license, and consulting costs. I'm not in the edge business, but I guess there are some operators where you can just pay a lot more and get better performance -- if you really need it.
[1] https://github.com/miloyip/nativejson-benchmark
-
How can I parse JSON with C?
There are some useful benchmarks here. I found them while looking for stats on json-c vs parson, which I've used a fair amount.
-
UniValue JSON Library for C++17 (and above)
If you're looking for benchmarks showing in which cases your library is better than the other 30 or so competitors, see this repo: https://github.com/miloyip/nativejson-benchmark
-
Rocket is a parsing framework for parsing using efficient parsing algorithms
JSON data files from this project: https://github.com/miloyip/nativejson-benchmark
-
How I cut GTA Online loading times by 70%
Such a shame, really. There are a ton of fast JSON parsers out there, like https://github.com/miloyip/nativejson-benchmark#parsing-time. And the second issue is just hilarious: let's scan the array millions of times, who needs hashmaps anyway?
What are some alternatives?
daw_json_link - Fast, convenient JSON serialization and parsing in C++
json-c - https://github.com/json-c/json-c is the official code repository for json-c. See the wiki for release tarballs for download. API docs at http://json-c.github.io/json-c/
simdjson - Parsing gigabytes of JSON per second : used by Facebook/Meta Velox, the Node.js runtime, ClickHouse, WatermelonDB, Apache Doris, Milvus, StarRocks
Jansson - C library for encoding, decoding and manipulating JSON data
json - JSON for Modern C++
EA Standard Template Library - EASTL stands for Electronic Arts Standard Template Library. It is an extensive and robust implementation that has an emphasis on high performance.
RapidJSON - A fast JSON parser/generator for C++ with both SAX/DOM style API
univalue - An easy-to-use and competitively fast JSON parsing library for C++17, forked from Bitcoin Cash Node's own UniValue library.
cpr - C++ Requests: Curl for People, a spiritual port of Python Requests.
text - What a c++ standard Unicode library might look like.
spdlog - Fast C++ logging library.