Test-data Alternatives
Similar projects and alternatives to test-data
-
simdjson
Parsing gigabytes of JSON per second: used by Facebook/Meta Velox, the Node.js runtime, ClickHouse, WatermelonDB, Apache Doris, Milvus, StarRocks
-
Jsons-for-AHKv2
Jsons.ahk for AHKv2, the lazy man's JSON. Handles and converts objects and classes; also functions as an obj -> str converter.
test-data reviews and mentions
-
New package : lspce - a simple LSP Client for Emacs
Update: I just did some testing on a >20MB JSON file.
-
Jsons.ahk for AHKv2 (feedback needed)
24.90 MB - https://github.com/json-iterator/test-data/raw/master/large-file.json
-
How to fix Emacs constant freezing on long lines?
But the performance of the 30.0.50 you downloaded from the GNU mirror is not far off. It can handle this large JSON file with ease: https://github.com/json-iterator/test-data/blob/master/large-file.json, either with the built-in JSON mode or with tree-sitter.
-
I built a tool to turn your JSON into a database
You can use this for any application where the data can fit in a JSON file (e.g. website CMS, blog, portfolio, small mobile apps, internal tools, …). You’ll be surprised at how much data a 20 MB JSON file can hold: https://github.com/json-iterator/test-data/blob/master/large-file.json
-
Show HN: Daba – Turn your JSON into a database
Hi all, I built this tool when I needed a simple LocalStorage-esque database for a client project, and figured others might want something similar.
Basically, it turns your JSON into a queryable, hosted database in seconds. You can read/update/delete JSON files by path, just like you would in JavaScript. So, something like get("users[7].address")
And while we’re at it, I also built a simple file storage service where you upload a file and it gives you the URL back.
A lot of my (and other dev friends') side projects will never require more data than a JSON file can handle. Yet we always have to jump through the hoops of setting up and using databases meant to handle huge amounts of data.
There are many other services that could benefit from the same minimalist philosophy. The idea is to have a bunch of building blocks of different services, and let the developer scale up/down the complexity as they see fit. I'm working on more services for daba (In no particular order: SQLite db?, auth, emails, …)
You can use this for any application where the data can fit in a JSON file (e.g. website CMS, blog, portfolio, small mobile apps, internal tools, …). You’ll be surprised at how much data a 20 MB JSON file can hold: https://github.com/json-iterator/test-data/blob/master/large...
Let me know what you think
-
To Unmarshal() or To Decode()? JSON Processing in Go Explained
"file.json" is our experimental variable. These will be JSON files of different sizes for each run. The first five JSON files are sourced from JSONPlaceholder - Free Fake REST API. The last JSON file (the largest one) is sourced from large-file.json in the json-iterator/test-data repository on GitHub.
-
Stats
json-iterator/test-data is an open source project licensed under the MIT License, an OSI-approved license.