Attempting to parse JSON at light-speed with Raku and simdjson

This page summarizes the projects mentioned and recommended in the original post on /r/rakulang

  • simdjson

    Parsing gigabytes of JSON per second: used by Facebook/Meta Velox, the Node.js runtime, ClickHouse, WatermelonDB, Apache Doris, Milvus, and StarRocks

    Hey folks, if you've been in IRC this last week, you've probably heard me mumbling to myself or crying out in agony while trying to bind Raku to simdjson. However, the pain was not in vain!

  • JSON-Simd

    Raku bindings to simdjson

    You can now parse JSON a little faster than with JSON::Fast (mileage may vary), using: https://github.com/rawleyfowler/JSON-Simd. A quick usage sketch follows this list.

  • krun

    High fidelity benchmark runner

    You don't mention your benchmark data or process. Are your results dominated by overhead, rather than by the conversion of JSON by either solution? If your benchmark processes 1MB of JSON, have you tried 1GB? 10GB? 100GB? Have you tried more rigorous benchmarking (at an extreme, using krun, though I'd defer getting into that until after you've exhausted other factors confounding a reliable benchmark)? A minimal timing harness along these lines follows this list.

  • raku-Inline-Rust

    Implementation of Rust FFI Omnibus examples with raku NativeCall

    The database example here includes a basic HashMap accessor (with Rust as the provider). A minimal NativeCall binding sketch follows this list.
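
Calling the new bindings might look roughly like the sketch below. JSON::Fast and its from-json routine are real; the JSON::Simd module name and its from-json-simd routine are assumptions made for illustration and may not match the actual interface exposed by the repository.

    # Rough comparison sketch. JSON::Fast and its from-json routine are real;
    # the "JSON::Simd" module name and its from-json-simd routine are assumed
    # here for illustration and may not match the bindings' actual API.
    use JSON::Fast;
    use JSON::Simd;       # assumed module name for the simdjson bindings

    my $json = slurp 'data.json';            # any JSON document

    my $via-fast = from-json($json);         # pure-Raku parser
    my $via-simd = from-json-simd($json);    # hypothetical simdjson-backed call

    say $via-fast eqv $via-simd;             # both should yield the same structure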
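On the benchmarking point, a throwaway timing harness in Raku looks roughly like this. It is not a substitute for a rigorous runner like krun, and the payload file names are placeholders you would generate yourself.

    # Crude timing loop: grow the payload until parse time, not call overhead,
    # dominates. File names are placeholders; throughput is approximate because
    # .chars counts characters rather than bytes.
    use JSON::Fast;

    for <small-1mb.json medium-100mb.json large-1gb.json> -> $file {
        next unless $file.IO.e;
        my $json  = slurp $file;
        my $start = now;
        my $data  = from-json($json);
        my $took  = now - $start;
        printf "%-18s %8.3f s  ~%.1f MB/s\n",
            $file, $took, ($json.chars / 1_000_000) / $took;
    }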
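For the FFI side, the general Rust FFI Omnibus pattern is to build a Rust cdylib that exports C-ABI functions and bind them from Raku with NativeCall. The sketch below is a generic illustration of that pattern, not code from raku-Inline-Rust; the library name mathlib and the add_ints function are invented for the example.

    # Generic NativeCall sketch (hypothetical library and symbol, not taken
    # from raku-Inline-Rust). Assumes a Rust cdylib built from something like:
    #     #[no_mangle] pub extern "C" fn add_ints(a: i32, b: i32) -> i32 { a + b }
    use NativeCall;

    # Bind the exported symbol; NativeCall resolves libmathlib.so/.dylib/.dll.
    sub add_ints(int32 $a, int32 $b --> int32) is native('mathlib') { * }

    say add_ints(40, 2);   # 42, once the library is built and on the loader path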

