I've been using it successfully for years on Android with sqlite-android [0], and the flexibility it has given me has been quite a relief. It's great to see that it is now included by default.
[0] https://github.com/requery/sqlite-android
In the case of PostgreSQL there are both json and jsonb. For SQLite, a hexdump of the database file shows a text representation, so JSON seems to be stored like json rather than jsonb. I am not aware of the full design and source code, but it seems some functions parse and cache the JSON representation.
https://github.com/sqlite/sqlite/blob/a0318fd7b4fbedbce74f13...
https://www.postgresql.org/docs/current/datatype-json.html
> The json and jsonb data types accept almost identical sets of values as input. The major practical difference is one of efficiency. The json data type stores an exact copy of the input text, which processing functions must reparse on each execution; while jsonb data is stored in a decomposed binary format that makes it slightly slower to input due to added conversion overhead, but significantly faster to process, since no reparsing is needed. jsonb also supports indexing, which can be a significant advantage.
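The trade-off described in the quoted docs can be illustrated with a toy Python sketch (this is not SQLite's or PostgreSQL's actual storage code, just an analogy): a json-style column keeps the exact input text and must reparse it on every read, while a jsonb-style column parses once into a decomposed form, losing insignificant whitespace and duplicate keys in exchange for cheap repeated access.

```python
import json

# Input with odd spacing and a duplicate key.
raw = '{"b": 1,  "a": 2, "a": 3}'

# "json"-style storage: keep the exact text; every read must reparse.
stored_text = raw
value = json.loads(stored_text)["a"]  # reparse on each access

# "jsonb"-style storage: parse once into a decomposed structure.
# Whitespace is gone and the duplicate key collapses (last value wins),
# but subsequent reads need no reparsing.
stored_parsed = json.loads(raw)
```

Here `stored_text` round-trips byte-for-byte, whereas `stored_parsed` becomes `{"b": 1, "a": 3}`, mirroring why jsonb is slightly slower to input but faster to process.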
Shameless self-promotion, but this is already possible using my shell: https://github.com/lmorg/murex
It works with YAML, TOML, JSON, jsonlines, CSV, and regular shell command output. You can import from any data format, convert to any other data format, and even do inline SQL relational lookups.
Since SQL inlining literally just imports the data into an in-memory sqlite3 database, you can do your JSON import into sqlite3 using this shell. In fact, I did exactly this last month when working with a cloud service's RESTful API, which returned two different JSON docs that needed to be restructured into two different tables with a relational query run between them.
The entire script was basically 3 lines of code. Doing the same in a lower-level language would have required multiple for loops, maps, etc. Doing the same in any other shell scripting language would have been almost impossible, as it would have required dozens of different CLI tools and no easy way to manage error handling.
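The underlying plumbing the comment describes (import two JSON documents into an in-memory sqlite3 database, then run a relational query across them) can be sketched in Python. The document names, field names, and query here are hypothetical stand-ins for the API payloads mentioned above, not the actual murex script:

```python
import json
import sqlite3

# Hypothetical sample payloads standing in for two JSON documents
# returned by a REST API.
users_doc = '[{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}]'
orders_doc = '[{"user_id": 1, "total": 9.5}, {"user_id": 1, "total": 3.0}]'

# Load each document into its own table in an in-memory database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER, name TEXT)")
db.execute("CREATE TABLE orders (user_id INTEGER, total REAL)")
db.executemany("INSERT INTO users VALUES (?, ?)",
               [(u["id"], u["name"]) for u in json.loads(users_doc)])
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [(o["user_id"], o["total"]) for o in json.loads(orders_doc)])

# Relational query joining the two imported documents.
rows = db.execute("""
    SELECT u.name, SUM(o.total)
    FROM users u JOIN orders o ON o.user_id = u.id
    GROUP BY u.name
""").fetchall()
```

In murex the import and conversion steps are handled by the shell itself, which is why the equivalent script collapses to a few lines.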