pronto VS simdjson

Compare pronto vs simdjson and see what their differences are.

simdjson

Parsing gigabytes of JSON per second: used by Facebook/Meta Velox, the Node.js runtime, ClickHouse, WatermelonDB, Apache Doris, Milvus, StarRocks (by simdjson)

                 pronto              simdjson
Mentions         4                   65
Stars            6                   18,409
Growth           -                   0.7%
Activity         0.0                 9.2
Latest commit    over 3 years ago    6 days ago
Language         Java                C++
License          MIT License         Apache License 2.0
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

pronto

Posts with mentions or reviews of pronto. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-12-08.
  • Buf raises $93M to deprecate REST/JSON
    6 projects | news.ycombinator.com | 8 Dec 2021
    5. Message streaming (gRPC streams are amazing)

    I can think of a whole host of features that can be built off of protos (I've even built ORMs off of protobufs for simple things [0]). The value prop is there IMO. HTTP + JSON APIs are a local minimum. The biggest concern, "I want to be able to view the data that is being sent back and forth", is a tooling consideration (curl ... isn't showing you the voltages from the physical layer; it is decoded). Buf is building that tooling.

    [0] - https://github.com/CaperAi/pronto

  • Parsing Gigabytes of JSON per Second
    7 projects | news.ycombinator.com | 23 Oct 2021
    I've written translation layers for such systems and it's not too bad. See this project from $job - 1: https://github.com/CaperAi/pronto

    It allowed us to have a single model for storage in the DB, for sending between services, and syncing to edge devices.

  • gRPC for Microservices Communication
    5 projects | news.ycombinator.com | 23 Sep 2021
    There's no reason you couldn't use gRPC with JSON as a serialized message format. For example, grpc-gateway [0] provides a very effective way of mapping a gRPC concept to HTTP/JSON. The thing is, after moving to gRPC, I've never really felt a desire to move back to JSON. While it may be correct to say "parsing JSON is fast enough", it's important to note that there's a "for most use cases" after that. Parsing protos is fast enough for even more use cases. You also get streams, which are amazing for APIs where you have to sync large amounts of data (listing large collections from a DB, for example) across two services.

    With gRPC you also have a standardized middleware API that is implemented for "all" languages. The concepts cleanly map across multiple languages and types are mostly solved for you.

    Adding to that you can easily define some conventions for a proto and make amazing libraries for your team. At a previous job I made this: https://github.com/CaperAi/pronto/

    It made it super easy to prototype multiple services: if you mocked a service backed by memory, we could plop it into a DB with zero effort.

    I think this "gRPC vs X" method of thinking isn't appropriate here because protos are more like a Object.prototype in JavaScript. They're a template for what you're sending. If you have the Message you want to send you can serialize that to JSON or read from JSON or XML or another propriety format and automatically get a host of cool features (pretty printing, serialization to text/binary, sending over the network, etc).

    [0] - https://github.com/grpc-ecosystem/grpc-gateway
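
    For illustration, here is a minimal C++ sketch of the "serialize the same Message to JSON, text, or binary" point above. It uses the well-known google.protobuf.Struct type so no custom .proto has to be compiled, and it assumes libprotobuf is installed; the exact return type of MessageToJsonString (absl::Status vs. the older util::Status) depends on the protobuf version, so the status is simply discarded here. The field names are made up.

        // Sketch: one protobuf Message, three serializations (JSON, text, binary).
        #include <iostream>
        #include <string>
        #include <google/protobuf/struct.pb.h>
        #include <google/protobuf/util/json_util.h>

        int main() {
          google::protobuf::Struct msg;
          (*msg.mutable_fields())["service"].set_string_value("pronto");
          (*msg.mutable_fields())["stars"].set_number_value(6);

          // JSON serialization comes for free.
          std::string json;
          (void)google::protobuf::util::MessageToJsonString(msg, &json);
          std::cout << "json:   " << json << "\n";

          // Human-readable text format (handy for debugging and pretty printing).
          std::cout << "text:\n" << msg.DebugString();

          // Compact binary wire format (what actually travels over gRPC).
          std::string wire = msg.SerializeAsString();
          std::cout << "binary: " << wire.size() << " bytes\n";
          return 0;
        }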

  • We Went All in on Sqlc/Pgx for Postgres and Go
    31 projects | news.ycombinator.com | 8 Sep 2021
    I attempted to make something similar to this, except in the opposite direction, at a previous job. It was called Pronto: https://github.com/CaperAi/pronto/

    It allowed us to store and query Protos in MongoDB. It wasn't perfect (lots of issues), but the idea was that rather than specifying custom models for all of our DB logic in our Java code, we could write a proto, and code could automatically import that proto and read/write it to the database. This made building tooling to debug issues very easy and made it very simple to hide a DB behind a gRPC API.

    The tool automated the boring stuff. I wish I could have extended this to have you define a service in a .proto and "compile" that into an ORM DAO-like thing automatically so you never need to worry about manually wiring that stuff ever again.

simdjson

Posts with mentions or reviews of simdjson. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-04-20.
  • Tips on adding JSON output to your command line utility. (2021)
    2 projects | news.ycombinator.com | 20 Apr 2024
    It's also supported by simdjson [0] (which has a lot of language bindings [1]):

    > Multithreaded processing of gigantic Newline-Delimited JSON (ndjson) and related formats at 3.5 GB/s

    [0] https://simdjson.org/

    [1] https://github.com/simdjson/simdjson?tab=readme-ov-file#bind...
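
    A short sketch of the ndjson/document-stream feature quoted above, based on simdjson's documented parse_many API (DOM front end). The buffer contents and field name are made up, and multithreaded batching only applies when simdjson is built with thread support.

        // Sketch: stream over a buffer of newline-delimited JSON documents.
        #include <iostream>
        #include "simdjson.h"

        int main() {
          using namespace simdjson;
          // Three newline-delimited JSON documents in one padded buffer.
          auto ndjson = "{\"id\":1}\n{\"id\":2}\n{\"id\":3}\n"_padded;

          dom::parser parser;
          // parse_many walks the buffer batch by batch rather than one parse() call per line.
          dom::document_stream docs = parser.parse_many(ndjson);
          for (dom::element doc : docs) {
            std::cout << doc["id"] << "\n";
          }
          return 0;
        }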

  • 1BRC Merykitty's Magic SWAR: 8 Lines of Code Explained in 3k Words
    4 projects | news.ycombinator.com | 9 Mar 2024
  • Training great LLMs from ground zero in the wilderness as a startup
    3 projects | news.ycombinator.com | 6 Mar 2024
  • simdjson: Parsing Gigabytes of JSON per Second
    1 project | news.ycombinator.com | 23 Jan 2024
  • Use any web browser as GUI, with Zig in the back end and HTML5 in the front end
    17 projects | news.ycombinator.com | 1 Jan 2024
    String parsing is negligible compared to the speed of the DOM which is glacially slow: https://news.ycombinator.com/item?id=38835920

    Come on, people, make an effort to learn how insanely fast computers are, and how insanely inefficient our software is.

    String parsing can be done at gigabytes per second: https://github.com/simdjson/simdjson If you think that is the slowest operation in the browser, please find some resources that talk about what is actually happening in the browser.

  • Cray-1 performance vs. modern CPUs
    4 projects | news.ycombinator.com | 25 Dec 2023
    Thanks for all the detailed information! That answers a bunch of my questions and the implementation of strlen is nice.

    The instruction I was thinking of is pshufb. An example ‘weird’ use can be found for detecting white space in simdjson: https://github.com/simdjson/simdjson/blob/24b44309fb52c3e2c5...

    This works as follows:

    1. Observe that each ASCII whitespace character ends with a different nibble.

    2. Make a vector of 16 bytes in which each byte is either the whitespace character whose final nibble equals that byte's index, or some other character whose final nibble differs from the index (e.g. the first element is space = 0x20; the next could be, say, 0xff, but not 0xf1, as that ends in the same nibble as its index).

    3. For each block where you want to find whitespace, compute pcmpeqb(pshufb(whitespace, input), input). The rules of pshufb mean (a) non-ASCII characters (i.e. bit 7 set) go to 0, so they compare false, and (b) other characters are replaced with an element of whitespace according to their last nibble, so they compare equal only if they are that whitespace character.

    I’m not sure how easy it would be to do such tricks with vgather.vv. In particular, the length of the input doesn’t matter (could be longer) but the length of white space must be 16 bytes. I’m not sure how the whole vlen stuff interacts with tricks like this where you (a) require certain fixed lengths and (b) may have different lengths for tables and input vectors. (and indeed there might just be better ways, eg you could imagine an operation with a 256-bit register where you permute some vector of bytes by sign-extending the nth bit of the 256-bit register into the result where the input byte is n).
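
    For concreteness, here is a compilable C++ sketch of steps 2 and 3 above (compile with -mssse3). It is a simplified variant rather than simdjson's exact table: the filler byte 0x80 cannot equal any ASCII input byte, and non-ASCII input bytes (bit 7 set) are zeroed by pshufb, so filler slots can never produce a false match.

        // Sketch: find ASCII whitespace in a 16-byte block with pshufb + pcmpeqb.
        #include <cstdio>
        #include <immintrin.h>

        int main() {
          const char F = static_cast<char>(0x80);  // filler no ASCII byte can match
          // 16-byte table indexed by the low nibble of each input byte:
          // ' ' = 0x20 at index 0, '\t' = 0x09 at 9, '\n' = 0x0A at 10, '\r' = 0x0D at 13.
          const __m128i whitespace = _mm_setr_epi8(
              ' ', F, F, F, F, F, F, F, F, '\t', '\n', F, F, '\r', F, F);

          const char text[17] = "a b\tc\nd\re2345678";  // 16 bytes + NUL
          const __m128i input =
              _mm_loadu_si128(reinterpret_cast<const __m128i*>(text));

          // pshufb: out[i] = whitespace[input[i] & 0x0F], or 0 if input[i] has bit 7 set.
          const __m128i shuffled = _mm_shuffle_epi8(whitespace, input);
          // pcmpeqb: equal only where the input byte is the whitespace character
          // stored at its own low-nibble slot.
          const __m128i eq = _mm_cmpeq_epi8(shuffled, input);

          // One bit per byte; prints 0x00aa (bytes 1, 3, 5, 7 are whitespace).
          printf("whitespace mask: 0x%04x\n", _mm_movemask_epi8(eq));
          return 0;
        }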

  • Codebases to read
    5 projects | /r/cpp | 5 Dec 2023
    Additionally, if you like low level stuff, check out libfmt (https://github.com/fmtlib/fmt) - not a big project, not difficult to understand. Or something like simdjson (https://github.com/simdjson/simdjson).
  • Simdjson: Parsing Gigabytes of JSON per Second
    1 project | news.ycombinator.com | 30 Nov 2023
  • Building a high performance JSON parser
    19 projects | news.ycombinator.com | 5 Nov 2023
    Everything you said is totally reasonable. I'm a big fan of napkin math and theoretical upper bounds on performance.

    simdjson (https://github.com/simdjson/simdjson) claims to fully parse JSON on the order of 3 GB/sec, which is faster than OP's Go whitespace parsing! These tests are running on different hardware, so it's not apples-to-apples (a quickstart sketch follows this post).

    The phrase "cannot go faster than this" is just begging for a "well ackshully", which I hate to do. But the fact that there is an existence proof of problem A running faster in C++ SIMD than OP's problem B in scalar Go is quite interesting and worth calling out, IMHO. But I admit it doesn't change the rest of the post.

  • New package : lspce - a simple LSP Client for Emacs
    4 projects | /r/emacs | 30 Jun 2023
    I have the same question as /u/JDRiverRun: how do you deal with JSON, do you parse JSON on the Rust side or on the Emacs side? I see that you are requiring json.el in your lspce.el, but I haven't looked through the entire file carefully. If you parse on the Rust side, do you use simdjson (there are at least two Rust bindings to it)? If yes, what are your impressions and experiences compared to a more "standard" JSON library?

What are some alternatives?

When comparing pronto and simdjson you can also consider the following projects:

simd-json - Rust port of simdjson

RapidJSON - A fast JSON parser/generator for C++ with both SAX/DOM style API

pike - Generate CRUD gRPC backends from single YAML description.

jsoniter - jsoniter (json-iterator) is a fast and flexible JSON parser available in Java and Go

sqlparser-rs - Extensible SQL Lexer and Parser for Rust

json - JSON for Modern C++

grpc-gateway - gRPC to JSON proxy generator following the gRPC HTTP spec

json-schema-validator - JSON schema validator for JSON for Modern C++

pggen - A database first code generator focused on postgres

JsonCpp - A C++ library for interacting with JSON.

sqlite

json - A C++11 library for parsing and serializing JSON to and from a DOM container in memory.