| | nativejson-benchmark | cpr |
|---|---|---|
| Mentions | 10 | 22 |
| Stars | 1,926 | 6,167 |
| Stars growth | - | 1.0% |
| Activity | 0.0 | 8.4 |
| Latest commit | over 1 year ago | 7 days ago |
| Language | JavaScript | C++ |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
nativejson-benchmark
-
Training great LLMs from ground zero in the wilderness as a startup
Well, it would depend on the specifics of the JSON file, but eyeballing the stats at https://github.com/miloyip/nativejson-benchmark/tree/master suggests that even on a 2015 MacBook, parsing with e.g. the Configuru parser proceeds at several megabytes per second.
- What C++ library do you wish existed but hasn’t been created yet?
-
How can I quickly parse a huge 45MB JSON file using JsonDecoder
Maybe you need to try some other third party json library and see if it helps. This is a good list https://github.com/miloyip/nativejson-benchmark
-
Why is Mastodon so slow?
Glancing at some benchmarks, RapidJSON stringifies at around 250MB/s on a single core (content-dependent, of course). Does not look like a bottleneck.
-
Show HN: DAW JSON Link
How does it compare to the immensely popular JSON for Modern C++ library by nlohmann? https://github.com/nlohmann/json
Also, you should add your library to the JSON benchmarks here: https://github.com/miloyip/nativejson-benchmark#parsing-time
-
Debunking Cloudflare’s recent performance tests
I like your ideas, but they seem difficult to enforce. It assumes good faith on all sides. One of the biggest complaints about AI/ML research results: It is frequently hard/impossible to replicate the results.
One idea: the edge competitors could create a public (SourceHut?) project that runs various daily tests against themselves. This would be similar to the JSON library benchmarks. [1] Then allow each competitor to continuously tweak their settings to accomplish the task in the shortest amount of time.
Also: It would be nice to see a cost analysis. For years, IBM's DB2 was insanely fast if you could afford to pay outrageous hardware, software license, and consulting costs. I'm not in the edge business, but I guess there are some operators where you can just pay a lot more and get better performance -- if you really need it.
[1] https://github.com/miloyip/nativejson-benchmark
-
How can I parse JSON with C?
There are some useful benchmarks here. I found them while looking for stats on json-c vs parson, which I've used a fair amount.
-
UniValue JSON Library for C++17 (and above)
If you're looking for benchmarks to show in which cases your library is better than the other 30 or so competitors, see this repo https://github.com/miloyip/nativejson-benchmark
-
Rocket is a parsing framework for parsing using efficient parsing algorithms
JSON data files from this project: https://github.com/miloyip/nativejson-benchmark
-
How I cut GTA Online loading times by 70%
Such a shame, really. There are a ton of fast JSON parsers out there, like https://github.com/miloyip/nativejson-benchmark#parsing-time. And the second issue is just hilarious: let's scan the array millions of times, who needs hashmaps anyway?
cpr
-
What C++ library do you wish existed but hasn’t been created yet?
This one might fit the bill https://github.com/libcpr/cpr
-
[CMake] Can't include external header in .h file
```cmake
cmake_minimum_required(VERSION 3.15)
project(xrpc++
    DESCRIPTION "C++ AT Protocol XRPC library"
    VERSION 1.0.0
    LANGUAGES CXX)

include(FetchContent)
FetchContent_Declare(cpr
    GIT_REPOSITORY https://github.com/libcpr/cpr.git
    GIT_TAG 2553fc41450301cd09a9271c8d2c3e0cf3546b73) # The commit hash for 1.10.x. Replace with the latest from: https://github.com/libcpr/cpr/releases
FetchContent_MakeAvailable(cpr)

FetchContent_Declare(json
    URL https://github.com/nlohmann/json/releases/download/v3.11.2/json.tar.xz)
FetchContent_MakeAvailable(json)

add_library(${PROJECT_NAME} SHARED
    src/lexicon.cpp
    src/xrpc.cpp
)
target_link_libraries(${PROJECT_NAME} PRIVATE cpr::cpr)
target_link_libraries(${PROJECT_NAME} PRIVATE nlohmann_json::nlohmann_json)
set_target_properties(${PROJECT_NAME} PROPERTIES VERSION ${PROJECT_VERSION})
set_target_properties(${PROJECT_NAME} PROPERTIES SOVERSION 1)
target_include_directories(${PROJECT_NAME} PUBLIC include)
set(CMAKE_BUILD_TYPE debug)
```
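One likely cause of the "can't include an external header in a .h file" symptom with this setup: cpr is linked `PRIVATE`, so its include directories and link requirements are not propagated to anything that consumes the library's public headers. This is standard CMake usage-requirement behavior, not something specific to cpr; a sketch of the fix, assuming a header under `include/` does `#include <cpr/cpr.h>`:

```cmake
# A header in include/ that does #include <cpr/cpr.h> makes cpr part of
# the library's public interface, so propagate its usage requirements:
target_link_libraries(${PROJECT_NAME} PUBLIC cpr::cpr)
```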
-
How to convert libcurl to C++?
There is also the cpr package, which should offer a more C++-focused interface on top of libcurl.
-
Trying to use libcpr, linking errors - newbie...
So I'm very new to C++ and I'm trying to write a C++ version of a tool that I put together in Python. I'm trying to use libcpr for all my HTTP needs. I've spent the day trying to get it set up and working, but I'm getting a bunch of linking errors when I try to build. I really don't know if I did the building of it correctly; I'm using Visual Studio Community 2022, and the Usage section of their docs talks about CMake and a couple of package manager methods.
- How are downloaders made? (examples in the text)
-
Standardise a C++ build tool and package manager?
I think vcpkg manifests have solved a really key portion of the "please give me these libraries" problem. A couple of lines in a JSON file, point CMake at your vcpkg toolchain script and triplet, and you're pretty much done with dependencies. I actually used it for a project with libcpr/cpr and a couple other popular libraries, and I was shocked at how painless it was to get up and running with some web request stuff.
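As a sketch, the "couple of lines in a JSON file" for a project like that is a `vcpkg.json` manifest next to the CMakeLists (package names `cpr` and `nlohmann-json` as they appear in the vcpkg registry; the project name is a placeholder):

```json
{
  "name": "my-app",
  "version": "0.1.0",
  "dependencies": [ "cpr", "nlohmann-json" ]
}
```

Configuring with `cmake -B build -DCMAKE_TOOLCHAIN_FILE=<vcpkg-root>/scripts/buildsystems/vcpkg.cmake` then builds and installs the listed dependencies automatically on the first configure.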
-
What are some cool modern libraries you enjoy using?
Libraries like nlohmann's json, cpr, fmt are prime examples of what I'm seeking. Any suggestions?
-
I'm getting a 422 Validation Failed from Github API. Only when making a request with the Cpr library.
Basically specifying the language and the repo, and it does work when the request is made from postman or from the browser. However, when using https://github.com/libcpr/cpr, I'm getting the following response:
- How to make a C++ web scraper?
What are some alternatives?
json-c - https://github.com/json-c/json-c is the official code repository for json-c. See the wiki for release tarballs for download. API docs at http://json-c.github.io/json-c/
libcurl - A command line tool and library for transferring data with URL syntax, supporting DICT, FILE, FTP, FTPS, GOPHER, GOPHERS, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, MQTT, POP3, POP3S, RTMP, RTMPS, RTSP, SCP, SFTP, SMB, SMBS, SMTP, SMTPS, TELNET, TFTP, WS and WSS. libcurl offers a myriad of powerful features
Jansson - C library for encoding, decoding and manipulating JSON data
C++ REST SDK - The C++ REST SDK is a Microsoft project for cloud-based client-server communication in native code using a modern asynchronous C++ API design. This project aims to help C++ developers connect to and interact with services.
EA Standard Template Library - EASTL stands for Electronic Arts Standard Template Library. It is an extensive and robust implementation that has an emphasis on high performance.
Boost.Beast - HTTP and WebSocket built on Boost.Asio in C++11
univalue - An easy-to-use and competitively fast JSON parsing library for C++17, forked from Bitcoin Cash Node's own UniValue library.
cpp-httplib - A C++ header-only HTTP/HTTPS server and client library
text - What a C++ standard Unicode library might look like.
curlpp - C++ wrapper around libcURL
simdjson - Parsing gigabytes of JSON per second : used by Facebook/Meta Velox, the Node.js runtime, ClickHouse, WatermelonDB, Apache Doris, Milvus, StarRocks
POCO - The POCO C++ Libraries are powerful cross-platform C++ libraries for building network- and internet-based applications that run on desktop, server, mobile, IoT, and embedded systems.