miller
jq
| | miller | jq |
|---|---|---|
| Mentions | 63 | 52 |
| Stars | 8,553 | 29,042 |
| Growth | - | 1.9% |
| Activity | 9.1 | 9.4 |
| Latest commit | 6 days ago | 3 days ago |
| Language | Go | C |
| License | GNU General Public License v3.0 or later | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
miller
- Qsv: Efficient CSV CLI Toolkit
-
jq 1.7 Released
jq and miller[1] are essential parts of my toolbelt, right up there with awk and vim.
[1]: https://github.com/johnkerl/miller
-
Perl first commit: a “replacement” for Awk and sed
> This works really well if your problem can be solved in one or two liners.
My personal comfort threshold is around the 100-line mark. It's even possible to write maintainable shell scripts up to 500 lines, but it mostly depends on the problem you're trying to solve, and the discipline of the programmer to follow best practices (use sane defaults, ShellCheck, etc.).
> It goes bad very quickly when, say, you have two CSV files and want to join them the SQL way.
In that case we're talking about structured data, and, yeah, Perl or Python would be easier to work with. That said, depending on the complexity of the CSV, you can still go a long way with plain Bash with IFS/read(1) or tr(1) to split CSV columns. This wouldn't be very robust, but there are tools that handle CSV specifically[1], which can be composed in a shell script just fine.
So it's always a balancing act between being productive quickly with a shell script and reaching for a programming language once the tools aren't a good fit or maintenance becomes an issue.
[1]: https://miller.readthedocs.io/
-
Need help on cleaning this data!!
where mlr is from https://github.com/johnkerl/miller
-
Running weekly average
if this class of problems (i.e., csv/tsv data) is your main target you may find miller (https://github.com/johnkerl/miller) much more useful in the long run
-
GQL: A new SQL like query language for .git files written in Rust
That said, you may be interested in Miller (https://github.com/johnkerl/miller) which provides similar capabilities for CSV, JSON, and XML files. It doesn't use a SQL grammar, but that's just the proverbial lipstick on the thing. I'm not the author, but I have used it and I see some parallels in use cases at the very least.
- johnkerl/miller: Miller is like awk, sed, cut, join, and sort for name-indexed data such as CSV, TSV, and tabular JSON
-
Any cli utility to create ascii/org mode tables?
worth giving Miller a shot
-
I wrote this iCalendar (.ics) command-line utility to turn common calendar exports into more broadly compatible CSV files.
CSV utilities (still haven't picked a favorite one...): https://github.com/harelba/q https://github.com/BurntSushi/xsv https://github.com/wireservice/csvkit https://github.com/johnkerl/miller
- Miller: Like Awk, sed, cut, join, and sort for CSV, TSV, and tabular JSON
jq
- Frawk: An efficient Awk-like programming language. (2021)
- Dehydrated: Letsencrypt/acme client implemented as a shell-script
-
I turned my open-source project into a full-time business
I think like you. But also, one does not necessarily know beforehand that they will want to make money.
A project could be born out of pure generosity, but after the happy initial phase it might get too heavy on the maintenance side, pushing the author toward burnout and possibly toward deciding to make money to keep pulling the cart forward.
However, here's something I do think: if you create something as open source, it should be out of goodwill and for the greater good, regardless of how it ends up being used; OSS licenses imply as much in their terms. If you later get tired or burned out, you should just step back and let the community keep taking care of it, just as happened with the jq tool [1].
[1]: https://github.com/jqlang/jq/releases/tag/jq-1.7
-
How to load JSON data in PostgreSQL with the COPY command
In this blog we'll see how to upload JSON directly using PostgreSQL's COPY command and a utility called jq!
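The usual shape of that pipeline is to flatten a JSON array into one compact document per line with jq, then feed the lines to `\copy`. A sketch with a hypothetical `data.json` and a hypothetical table `raw(doc jsonb)`:

```shell
# Hypothetical input: a JSON array of objects.
cat > data.json <<'EOF'
[{"a":1},{"a":2}]
EOF

# jq -c emits one compact JSON document per line (NDJSON).
jq -c '.[]' data.json > rows.ndjson
cat rows.ndjson
# → {"a":1}
# → {"a":2}

# Loading step (sketch; COPY's text format treats backslashes specially,
# so escape them first if your documents may contain any):
#   sed 's/\\/\\\\/g' rows.ndjson | psql -c "\copy raw(doc) FROM STDIN"
```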
-
How to Recover Locally Deleted Files From Github
And we can then make it easier to find the commit by filtering the response with jq.
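The GitHub commits API returns a JSON array, so jq can pick out just the fields you need. A sketch (OWNER/REPO/PATH are placeholders; a minimal fixture stands in for the API response here):

```shell
# Real call would be:
#   curl -s "https://api.github.com/repos/OWNER/REPO/commits?path=PATH"
# Fixture simulating one commit in the response:
echo '[{"sha":"abcdef1234567890","commit":{"committer":{"date":"2024-01-01T00:00:00Z"}}}]' |
  jq -r '.[] | "\(.sha[0:7])  \(.commit.committer.date)"'
# → abcdef1  2024-01-01T00:00:00Z
```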
-
Essential Command Line Tools for Developers
Official Documentation: jqlang.github.io/jq
-
Command line tools I always install on Ubuntu servers
To handle JSON files and JSON output in a script, or to format and highlight it, jq can be very handy. Many command-line tools provide JSON output, so you don't have to write a custom parser for a table or a list printed in a terminal. Instead, you can use jq to get a specific value from the output or even modify it. For more information, you can visit https://jqlang.github.io/jq/
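A quick sketch of that pattern, using iproute2's `-j` flag (available on modern Linux) as the JSON-emitting tool:

```shell
# Pull one value out of a command's JSON output instead of scraping columns:
ip -j addr | jq -r '.[].ifname'

# Or reshape the output, keeping only the fields you care about:
ip -j link | jq '[.[] | {ifname, mtu}]'
```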
-
How I use Nix in my Elm projects
In some projects I've wanted to use HTTPie to test APIs and jq to work with some JSON data. Nix has been really helpful in managing those dependencies that I can't easily get from npm.
-
Gooey: Turn almost any Python command line program into a full GUI application
> I'd love to see programs communicate through a typed JSON/proto format that shed enough details to make this more independent, and get useful shell command structuring/completion or full blown GUIs from simply introspecting the expected input and output types.
You should try PowerShell. It's basically Microsoft's .NET ecosystem molded into an interactive command line. I'm not entirely sure if PowerShell can make full use of the static types that build up its core, but its ability to exchange objects on the command line is almost unmatched.
On Linux you can use `jc` (https://github.com/kellyjonbrazil/jc) combined with `jq` (https://jqlang.github.io/jq/) to glue together command lines.
-
To a Man with `Jq`, Everything Looks Like JSON
Yeah, but muscle memory bites me all the time and I put the backslash on the closing paren too, because I'm so used to the regex usage of that syntax, which needs them to match.
I also want to draw the reader's attention to the magic of |@uri <https://github.com/jqlang/jq/blob/jq-1.7/docs/content/manual...> for a bunch of cases, but doubly so in TFA's case where they're plugging strings into a URI context. Simple string concat often works great for "hello world", but the world is not always just hello, so one quick use of the filter and jq's got your back
echo "the world's scary" | jq -Rr '"\(.)"'
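To make the `@uri` point concrete (hypothetical example URL): it percent-encodes everything except unreserved URL characters, so interpolated query values stay intact.

```shell
# Space becomes %20, & becomes %26 — the query string stays well-formed:
jq -rn --arg q "hello world & more" '"https://example.com/search?q=\($q|@uri)"'
# → https://example.com/search?q=hello%20world%20%26%20more
```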
What are some alternatives?
visidata - A terminal spreadsheet multitool for discovering and arranging data
yq - Command-line YAML, XML, TOML processor - jq wrapper for YAML/XML/TOML documents
xsv - A fast CSV command line toolkit written in Rust.
jp - Validate and transform JSON with Bash
jq - Command-line JSON processor [Moved to: https://github.com/jqlang/jq]
gojq - Pure Go implementation of jq
dasel - Select, put and delete data from JSON, TOML, YAML, XML and CSV files with a single tool. Supports conversion between formats and can be used as a Go package.
Jolt - JSON to JSON transformation library written in Java.
csvtk - A cross-platform, efficient and practical CSV/TSV toolkit in Golang
yq - yq is a portable command-line YAML, JSON, XML, CSV, TOML and properties processor
jmespath.py - JMESPath is a query language for JSON.