| | q | gron |
|---|---|---|
| Mentions | 46 | 64 |
| Stars | 10,126 | 13,520 |
| Growth | - | - |
| Activity | 2.1 | 0.0 |
| Last commit | 12 days ago | 6 months ago |
| Language | Python | Go |
| License | GNU General Public License v3.0 only | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
q
-
I wrote this iCalendar (.ics) command-line utility to turn common calendar exports into more broadly compatible CSV files.
CSV utilities (still haven't picked a favorite one...): https://github.com/harelba/q https://github.com/BurntSushi/xsv https://github.com/wireservice/csvkit https://github.com/johnkerl/miller
- Request for help with Excel automation
-
Show HN: ClickHouse-local – a small tool for serverless data analytics
I think they're talking about https://github.com/harelba/q, which is not very fast.
-
sqly - execute SQL against CSV / JSON with shell
Apparently, many people thought the same thing; the tools to execute SQL against CSV were trdsql, q, csvq, and TextQL. They were highly functional; however, they had many options and no input completion. I found them just a little difficult to use.
-
Q – Run SQL Directly on CSV or TSV Files
Hi, author of q here.
Regarding the error you got, q currently does not autodetect headers, so you'd need to add -H as a flag in order to use the "country" column name. You're absolutely correct on failing fast here - it's a bug which I'll fix.
In general regarding speed - q supports automatic caching of CSV files (through the "-C readwrite" flag). Once it's activated, q will write the data into another file (with a .qsql extension) and will use it automatically in further queries to speed things up considerably.
Effectively, the .qsql files are regular sqlite3 files (with some metadata), and q can be used to query them directly (or any regular sqlite3 file), including the ability to seamlessly join between multiple sqlite3 files.
http://harelba.github.io/q/#auto-caching-examples
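The comment above says the .qsql cache is just a regular sqlite3 database. A minimal Python sketch of that idea, loading a CSV into SQLite and querying it the way q does internally (the table name, column names, and sample data here are made up for illustration; q's actual cache layout and metadata are not reproduced):

```python
import csv
import io
import sqlite3

# Sample CSV standing in for a file you would normally pass to q.
data = io.StringIO("country,sales\nUS,100\nUS,50\nDE,70\n")

# An in-memory database; q's -C readwrite effectively persists
# the same kind of table to a .qsql file next to the CSV.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (country TEXT, sales INTEGER)")

reader = csv.reader(data)
next(reader)  # skip the header row (what q's -H flag accounts for)
conn.executemany("INSERT INTO t VALUES (?, ?)", reader)

rows = conn.execute(
    "SELECT country, SUM(sales) FROM t GROUP BY country ORDER BY country"
).fetchall()
print(rows)  # [('DE', 70), ('US', 150)]
```

Because the cache is plain SQLite, any sqlite3 client (not just q) can open a .qsql file directly, which is what makes the cross-file joins mentioned above possible.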
- PostgreSQL alternative for Large amounts of data
-
q VS trdsql - a user suggested alternative
2 projects | 25 Jun 2022
- One-liner for running queries against CSV files with SQLite
gron
-
Frawk: An efficient Awk-like programming language. (2021)
gron (https://github.com/tomnomnom/gron) to transform it and query and then invert the transformation?
- Show HN: Flatito, grep for YAML and JSON files
- Gron: Make JSON greppable
-
Make JSON Greppable
It buffers all of its output statements in memory before writing to stdout:
https://github.com/tomnomnom/gron/blob/master/main.go#L204
- Ask HN: What are some unpopular technologies you wish people knew more about?
-
Jaq – A jq clone focused on correctness, speed, and simplicity
Have you tried `gron`?
It converts your nested json into a line by line format which plays better with tools like `grep`
From the project's README:
▶ gron "https://api.github.com/repos/tomnomnom/gron/commits?per_page..." | fgrep "commit.author"
json[0].commit.author = {};
json[0].commit.author.date = "2016-07-02T10:51:21Z";
json[0].commit.author.email = "[email protected]";
json[0].commit.author.name = "Tom Hudson";
https://github.com/tomnomnom/gron
It was suggested to me in HN comments on an article I wrote about `jq`, and I have found myself using it a lot in my day to day workflow
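The transformation gron performs is easy to describe: walk the JSON tree and emit one JavaScript-style assignment per node. A minimal Python sketch of that idea (not the real Go implementation; real gron also quotes keys containing special characters using a `["key"]` form, which this skips):

```python
import json

def gron_lines(value, path="json"):
    """Yield gron-style assignment lines, one per node in the JSON tree."""
    if isinstance(value, dict):
        yield f"{path} = {{}};"
        for key, child in value.items():
            yield from gron_lines(child, f"{path}.{key}")
    elif isinstance(value, list):
        yield f"{path} = [];"
        for i, child in enumerate(value):
            yield from gron_lines(child, f"{path}[{i}]")
    else:
        # Leaf values are re-serialized as JSON literals.
        yield f"{path} = {json.dumps(value)};"

doc = json.loads('{"commit": {"author": {"name": "Tom Hudson"}}}')
for line in gron_lines(doc):
    print(line)
```

Each output line is independently greppable, which is the whole point: `grep` a line like `json.commit.author.name = "Tom Hudson";` and you still know exactly where the value lives in the tree.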
-
Interactive Examples for Learning Jq
> So all I want is a tool to go from json => line oriented and I will do the rest with the vast library of experience I already have at transformations on the command line.
The tool for that is likely https://github.com/tomnomnom/gron
-
Modern Linux Tools vs. Unix Classics: Which Would I Choose?
If JQ is too much, see GRON &| Miller
gron transforms JSON into discrete assignments to make it easier to grep for what you want https://github.com/tomnomnom/gron
Miller is like awk, sed, cut, join, and sort for data formats such as CSV, TSV, JSON, and JSON Lines https://github.com/johnkerl/miller
- XML is better than YAML
-
jq 1.7 Released
And jless [1] and gron [2].
This is the first I'm hearing of gron, but I'm adding it here for completeness' sake. Meanwhile, JSON seems to be becoming a standard for CLI tools. The ideal scenario would be if every CLI tool had a --json flag or something similar, so that jc is no longer needed.
[1] https://jless.io/
[2] https://github.com/tomnomnom/gron
What are some alternatives?
textql - Execute SQL against structured text like CSV or TSV
jq - Command-line JSON processor [Moved to: https://github.com/jqlang/jq]
csvq - SQL-like query language for csv
jfq - JSONata on the command line
octosql - OctoSQL is a query tool that allows you to join, analyse and transform data from multiple databases and file formats using SQL.
xidel - Command line tool to download and extract data from HTML/XML pages or JSON-APIs, using CSS, XPath 3.0, XQuery 3.0, JSONiq or pattern matching. It can also create new or transformed XML/HTML/JSON documents.
InquirerPy - :snake: Python port of Inquirer.js (A collection of common interactive command-line user interfaces)
pup - Parsing HTML at the command line
xsv - A fast CSV command line toolkit written in Rust.
JsonPath - Java JsonPath implementation
ledger - Double-entry accounting system with a command-line reporting interface
fx - Terminal JSON viewer & processor