ojg vs jq-zsh-plugin
| | ojg | jq-zsh-plugin |
|---|---|---|
| Mentions | 17 | 4 |
| Stars | 794 | 297 |
| Growth | - | - |
| Activity | 7.0 | 6.0 |
| Latest commit | 17 days ago | 17 days ago |
| Language | Go | Shell |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
ojg
- Interactive Examples for Learning Jq
I found jq difficult to use, which is why OjG (https://github.com/ohler55/ojg) is based on JSONPath instead. There are still a lot of options, but it only takes a couple of help screens to figure out what they are.
- Building a high performance JSON parser
You might want to take a look at https://github.com/ohler55/ojg. It takes a different approach with a single-pass parser. There are performance benchmarks on the README.md landing page.
- A Journey building a fast JSON parser and full JSONPath
I like the "Simple Encoding Notation" (SEN) of the underlying library: https://github.com/ohler55/ojg/blob/develop/sen.md
- Oj Is on Tap
- SEN: Simple Encoding Notation
- The fastest tool for querying large JSON files is written in Python (benchmark)
- FX: An interactive alternative to jq to process JSON
Another alternative is the oj app (ojg/cmd/oj), which is part of https://github.com/ohler55/ojg. It relies on JSONPath for extraction and manipulation of JSON.
- Go 1.17 Release Notes
- OjG now has a tokenizer that is almost 10 times faster than json.Decode
I promise to add more examples but in the meantime there are the test files. The one for Unmarshal is https://github.com/ohler55/ojg/blob/develop/oj/unmashall_test.go
- The Pretty JSON Revolution
jq-zsh-plugin
- Interactive Examples for Learning Jq
- Analyzing multi-gigabyte JSON files locally
https://github.com/reegnz/jq-zsh-plugin
I find that for big datasets, choosing the right format is crucial. Using json-lines format plus some shell filtering (e.g. head or tail to limit the range, egrep or ripgrep for the more trivial filtering) reduces the dataset to a couple of megabytes, and then I use that jq REPL of mine to iterate quickly on the final jq expression.
I found that the REPL form factor works really well when you don't exactly know what you're digging for.
What are some alternatives?
jsonparser - One of the fastest alternative JSON parsers for Go that does not require a schema
semi_index - Implementation of the JSON semi-index described in the paper "Semi-Indexing Semi-Structured Data in Tiny Space"
jsonic - All you need with JSON
z-a-readurl - 🌀 An annex that automatically downloads the newest version of a file whose URL is hosted on a webpage
fastjson - Fast JSON parser and validator for Go. No custom structs, no code generation, no reflection
json-buffet
ask - A Go package that provides a simple way of accessing nested properties in maps and slices.
lnav - Log file navigator
jettison - Highly configurable, fast JSON encoder for Go
reddit_mining
json2go - Create go type representation from json
ClickHouse - ClickHouse® is a free analytics DBMS for big data