sqlpp11 vs nativejson-benchmark

| | sqlpp11 | nativejson-benchmark |
|---|---|---|
| Mentions | 3 | 10 |
| Stars | 2,352 | 1,926 |
| Growth | - | - |
| Activity | 7.8 | 0.0 |
| Latest commit | 5 days ago | over 1 year ago |
| Language | C++ | JavaScript |
| License | BSD 2-clause "Simplified" License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
sqlpp11
-
What C++ library do you wish existed but hasn’t been created yet?
sqlpp11 actually helps in this area (which imo is the most error-prone part of using a database) and offers compile-time query checking
-
Using CPP with SQL
I haven't used this, though I did see some presentations about it which got my interest: https://github.com/rbock/sqlpp11
-
I don't want to learn your garbage query language
If you love C++ Template magic look at https://github.com/rbock/sqlpp11
This allows "normal" C++ code, which the compiler converts into the query string, allowing code like:
for (const auto& row : db(select(all_of(foo)).from(foo).where(foo.hasFun or foo.name == "joker")))
nativejson-benchmark
-
Training great LLMs from ground zero in the wilderness as a startup
Well, it would depend on the specifics of the JSON file, but eyeballing the stats at https://github.com/miloyip/nativejson-benchmark/tree/master suggests that even on a 2015 MacBook, parsing with e.g. the Configuru parser proceeds at several megabytes per second.
-
What C++ library do you wish existed but hasn’t been created yet?
-
How can I quickly parse a huge 45MB JSON file using JsonDecoder
Maybe you need to try some other third-party JSON library and see if it helps. This is a good list: https://github.com/miloyip/nativejson-benchmark
-
Why is Mastodon so slow?
Glancing at some benchmarks, RapidJSON stringifies at around 250MB/s on a single core (content-dependent, of course). Does not look like a bottleneck.
-
Show HN: DAW JSON Link
How does it compare to the immensely popular JSON for Modern C++ library by nlohmann? https://github.com/nlohmann/json
Also, you should add your library to the JSON benchmarks here: https://github.com/miloyip/nativejson-benchmark#parsing-time
-
Debunking Cloudflare’s recent performance tests
I like your ideas, but they seem difficult to enforce, since they assume good faith on all sides. One of the biggest complaints about AI/ML research is that results are frequently hard or impossible to replicate.
One idea: The edge competitors could create a public (SourceHut?) project that runs various daily tests against themselves. This would be similar to the JSON library benchmarks. [1] Then allow each competitor to continuously tweak their settings to accomplish the task in the shortest amount of time.
Also: It would be nice to see a cost analysis. For years, IBM's DB2 was insanely fast if you could afford to pay outrageous hardware, software license, and consulting costs. I'm not in the edge business, but I guess there are some operators where you can just pay a lot more and get better performance -- if you really need it.
[1] https://github.com/miloyip/nativejson-benchmark
-
How can I parse JSON with C?
There are some useful benchmarks here. I found it while looking for stats on json-c vs parson, which I've used a fair amount.
-
UniValue JSON Library for C++17 (and above)
If you're looking for benchmarks showing in which cases your library is better than the 30 or so other competitors, then see this repo: https://github.com/miloyip/nativejson-benchmark
-
Rocket is a parsing framework for parsing using efficient parsing algorithms
JSON data files from this project: https://github.com/miloyip/nativejson-benchmark
-
How I cut GTA Online loading times by 70%
Such a shame, really. There are a ton of fast JSON parsers out there, like https://github.com/miloyip/nativejson-benchmark#parsing-time. And the second issue is just hilarious: let's scan an array millions of times, who needs hashmaps anyway?
What are some alternatives?
pggen - Generate type-safe Go for any Postgres query. If Postgres can run the query, pggen can generate code for it.
json-c - https://github.com/json-c/json-c is the official code repository for json-c. See the wiki for release tarballs for download. API docs at http://json-c.github.io/json-c/
SqlKata Query Builder - SQL query builder, written in C#, helps you build complex queries easily, supports SqlServer, MySql, PostgreSql, Oracle, Sqlite and Firebird
Jansson - C library for encoding, decoding and manipulating JSON data
libsqldb - Wrapper to different SQL backends
EA Standard Template Library - EASTL stands for Electronic Arts Standard Template Library. It is an extensive and robust implementation that has an emphasis on high performance.
honeysql - Turn Clojure data structures into SQL
univalue - An easy-to-use and competitively fast JSON parsing library for C++17, forked from Bitcoin Cash Node's own UniValue library.
sqlx - general purpose extensions to golang's database/sql
text - What a c++ standard Unicode library might look like.
mysql - MySQL C++ client based on Boost.Asio
simdjson - Parsing gigabytes of JSON per second : used by Facebook/Meta Velox, the Node.js runtime, ClickHouse, WatermelonDB, Apache Doris, Milvus, StarRocks