| | csvq | jq |
|---|---|---|
| Mentions | 14 | 54 |
| Stars | 1,450 | 29,146 |
| Growth | - | 1.0% |
| Activity | 2.7 | 9.3 |
| Last commit | 5 months ago | about 21 hours ago |
| Language | Go | C |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
csvq
-
Fx – Terminal JSON Viewer
Sure can do, if you already use that shell [1], but personally I like specific tools for specific jobs, such as jq [2], fx, csvq [3], etc. There's value in decoupling shells from utils (modularity, speed, innovation, etc.).
[1] I don't, but I'm tempted to try; I like its data-types concept
[2] https://jqlang.github.io/jq/
[3] https://github.com/mithrandie/csvq
-
Tool to interact with CSV
csvq
-
Can SQL be used without an RDBMS?
There is a way of running SQL-like queries against CSV files.
-
Yq: a portable command-line YAML, JSON, XML, CSV and properties processor
Lately I have had to do a lot of flat file analysis and tools along these lines have been a godsend. Will check this out.
My go-to lately has been csvq (https://mithrandie.github.io/csvq/). Really nice to be able to run complicated selects right over a CSV file with no setup at all.
-
How do you merge CSV tables?
csvq (https://mithrandie.github.io/csvq/)
-
Tool to explore big data sets
I usually do this with awk, my largest target files being half a TB in size for a project last year (and far too large to hold entirely in RAM). There are some other utilities like csvq and csvsql both of which let you write SQL-style queries against CSV files, but I'm not sure how they perform on large files. There's a nice list of CSV manipulation tools too if any of those jog your memory.
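The awk approach mentioned above works well on files far larger than RAM because it streams line by line. A minimal sketch (the column layout and threshold are invented for illustration):

```shell
# Stream-filter a CSV with awk: constant memory, single pass.
# Print column 1 wherever column 2 exceeds 100, skipping the header row.
printf 'id,amount\n1,200\n2,50\n' |
  awk -F, 'NR > 1 && $2 > 100 { print $1 }'
# -> 1
```

On a real half-terabyte file you would replace the `printf` with the file name; memory use stays flat regardless of input size.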
-
sqly - execute SQL against CSV / JSON with shell
Apparently, many people had the same thought; existing tools to execute SQL against CSV included trdsql, q, csvq, and TextQL. They were highly functional; however, they had many options and no input completion. I found them just a little difficult to use.
- One-liner for running queries against CSV files with SQLite
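That SQLite one-liner pattern can be sketched with the stock sqlite3 shell (file, table, and column names are made up here): `.import` in CSV mode creates the table and takes column names from the header row.

```shell
# Make a throwaway CSV.
printf 'id,price\n1,9.50\n2,3.25\n' > items.csv
# Import into an in-memory table and query it in a single invocation.
sqlite3 -cmd '.mode csv' -cmd '.import items.csv items' :memory: \
  'SELECT count(*) FROM items;'
# -> 2
```

Note that `.import` loads everything as TEXT; cast columns if you need numeric comparisons to behave strictly.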
-
Most efficient way to query .CSV files for Mac?
Please check out this tool https://github.com/mithrandie/csvq
-
Looking for: library to turn SQL (or abstracted) to code & execute against custom backend (slice of structs)
If you are looking to query non-database data with SQL statements, then you may want to check out something like https://github.com/mithrandie/csvq (SQL for CSV).
jq
-
Data Science at the Command Line, 2nd Edition (2021)
Thanks, if anyone else is interested there is an explanation of this feature here: https://subtxt.in/library-data/2016/03/28/json_stream_jq And: https://github.com/jqlang/jq/wiki/FAQ#streaming-json-parser
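The streaming parser discussed in that FAQ trades convenience for bounded memory: `jq --stream` emits `[path, value]` events one at a time instead of loading the whole document. A small illustration:

```shell
# Each scalar becomes a [path, value] event; closing events carry only
# the path, so jq never needs to hold the full input in memory.
# The first event for this input is [["a"],1].
echo '{"a": 1, "b": [2, 3]}' | jq -c --stream '.'
```

Combined with `fromstream` and `truncate_stream`, this lets jq process JSON documents much larger than available RAM.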
The last time I tried, I think the reason I gave up on jq for large inputs was that throughput would max out at 7 MB/s, whereas the same job with Spark SQL on the same hardware (a MacBook) would max out at 250 MB/s. So I started looking into other solutions for big data, while I use jq in parallel for small data across multiple files.
I will test it out again, since it has been 4-5 years since I last tried, but I believe jaq is still preferred for large inputs. Still, for big data I prefer Spark/Polars/ClickHouse, etc.
-
Bytecode VMs in Surprising Places
Looks like you are correct https://github.com/jqlang/jq/blob/ed8f7154f4e3e0a8b01e6778de...
- Frawk: An efficient Awk-like programming language. (2021)
- Dehydrated: Letsencrypt/acme client implemented as a shell-script
-
I turned my open-source project into a full-time business
I think like you. But also, one does not necessarily know beforehand that they will want to make money.
Like a project could be born out of pure generosity, but after the happy initial phase the project might get too heavy on the maintenance requirements, causing the author to approach burnout, and possibly deciding that they want to make money to continue pulling the cart forward.
However, here's something I do think: if you create something as Open Source, it should be out of a mentality of goodwill and for the greater good, regardless of how it ends up being used. OSS licenses mean this with their terms. If you later get tired or burned out, you should just retire and allow the community to keep taking care of it, just as happened with the jq tool [1].
[1]: https://github.com/jqlang/jq/releases/tag/jq-1.7
-
How to load JSON data in PostgreSQL with the COPY command
In this blog we'll see how to load JSON directly using the PostgreSQL COPY command and a utility called jq!
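One plausible shape for that workflow (file and table names here are invented): flatten the JSON array into one object per line with jq, then COPY those lines into a jsonb column.

```shell
# A hypothetical JSON array export.
echo '[{"id":1,"name":"a"},{"id":2,"name":"b"}]' > input.json
# COPY wants one record per line, so emit compact NDJSON.
jq -c '.[]' input.json > rows.ndjson
cat rows.ndjson
# Then, inside psql (sketch):
#   CREATE TABLE raw (doc jsonb);
#   \copy raw FROM 'rows.ndjson'
# Caveat: COPY's text format treats backslashes specially, so JSON
# containing escape sequences may need extra quoting.
```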
-
How to Recover Locally Deleted Files From Github
And we can then make it easier to find the commit by filtering the response with jq.
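For instance, given a commits response from the GitHub API (the payload below is a hand-made stub, not real data), jq can reduce it to just the fields you are scanning for:

```shell
# Stubbed fragment of a GitHub commits API response.
resp='[{"sha":"abc123","commit":{"message":"remove old script"}}]'
# Print "<sha> <message>" for each commit using string interpolation.
echo "$resp" | jq -r '.[] | "\(.sha) \(.commit.message)"'
# -> abc123 remove old script
```

Piping the real `curl` response through the same filter makes it easy to grep for the commit that deleted the file.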
-
Essential Command Line Tools for Developers
Official Documentation: jqlang.github.io/jq
-
Command line tools I always install on Ubuntu servers
To handle JSON files and JSON output in a script, or to format and highlight it, jq can be very handy. Many command-line tools provide JSON output, so you don't have to write a custom parser for a table or a list in a terminal. Instead, you can use jq to get a specific value from the output or even modify the output. For more information, you can visit https://jqlang.github.io/jq/
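For example (the field names are invented), extracting a value and modifying one look like this:

```shell
# Pull a single field out of some tool's JSON output; -r drops the quotes.
echo '{"name":"web","status":"running"}' | jq -r '.status'
# -> running

# Or rewrite a field; -c keeps the result on one line.
echo '{"name":"web","status":"running"}' | jq -c '.status = "stopped"'
# -> {"name":"web","status":"stopped"}
```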
-
How I use Nix in my Elm projects
In some projects I've wanted to use HTTPie to test APIs and jq to work with some JSON data. Nix has been really helpful in managing those dependencies that I can't easily get from npm.
What are some alternatives?
querycsv - QueryCSV enables you to load CSV files and manipulate them using SQL queries then after you finish you can export the new values to a CSV file
yq - Command-line YAML, XML, TOML processor - jq wrapper for YAML/XML/TOML documents
q - Run SQL directly on delimited files and multi-file sqlite databases
jp - Validate and transform JSON with Bash
yq - yq is a portable command-line YAML, JSON, XML, CSV, TOML and properties processor
gojq - Pure Go implementation of jq
Jolt - JSON to JSON transformation library written in Java.
miller - Miller is like awk, sed, cut, join, and sort for name-indexed data such as CSV, TSV, and tabular JSON
dasel - Select, put and delete data from JSON, TOML, YAML, XML and CSV files with a single tool. Supports conversion between formats and can be used as a Go package.
duckdb - DuckDB is an in-process SQL OLAP Database Management System
jmespath.py - JMESPath is a query language for JSON.