RecordStream vs miller

| | RecordStream | miller |
|---|---|---|
| Mentions | 3 | 63 |
| Stars | 298 | 8,559 |
| Growth | - | - |
| Activity | 2.6 | 9.0 |
| Last commit | almost 4 years ago | 7 days ago |
| Language | Perl | Go |
| License | GNU General Public License v3.0 or later | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
RecordStream
-
Miller – a tool for querying, shaping, and reformatting data in CSV, TSV, and JSON
It's interesting watching these types of tools get re-invented periodically:
https://github.com/benbernard/RecordStream
It shows that the Unix model of many small, composable tools is very powerful, but it also shows that POSIX is missing some essential pieces that everyone keeps trying to add or reinvent.
-
Miller CLI – Like Awk, sed, cut, join, and sort for CSV, TSV and JSON
I don't know about Miller's portability, but RecordStream (https://github.com/benbernard/RecordStream) is my go-to Swiss Army knife.
-
A Lisp REPL as my main shell (article)
That record/field parsing library would be a tool to handle a broad category of command-line programs. Once the library has broken the input stream into a collection of records and fields, another layer would then turn them into internal representations. The JSON-based RecordStream tools are illustrative here: there are some tools that parse based on a delimiter or a regular expression, some that parse documented generic non-JSON formats like XML, and some that parse application-specific files like tcpdump outputs. In a Lisp world, all of the dedicated stream manipulation tools are redundant, and you avoid parsing and printing at every step in the chain.
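The core idea behind RecordStream-style pipelines is that each stage consumes and produces a stream of JSON records, so stages compose like Unix filters. A minimal sketch in Python of what such a stage looks like; the function names `grep_records` and `to_lines` are illustrative, not part of RecordStream itself:

```python
import json
from typing import Iterable, Iterator

def grep_records(lines: Iterable[str], key: str, value) -> Iterator[dict]:
    """One pipeline stage: parse each JSON-lines record and keep
    only those whose `key` field equals `value`."""
    for line in lines:
        record = json.loads(line)
        if record.get(key) == value:
            yield record

def to_lines(records: Iterable[dict]) -> Iterator[str]:
    """Serialize records back to JSON lines so the next stage
    (or a plain Unix tool) can consume them."""
    return (json.dumps(record) for record in records)

# Stages chain like `recs-grep | recs-totable` would on the shell:
raw = ['{"host": "a", "status": 200}', '{"host": "b", "status": 500}']
errors = list(to_lines(grep_records(raw, "status", 500)))
```

Once the input has been parsed into records, every downstream stage works on structure rather than re-parsing text, which is exactly the redundancy the Lisp-shell article points at.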
miller
- Qsv: Efficient CSV CLI Toolkit
-
jq 1.7 Released
jq and miller[1] are essential parts of my toolbelt, right up there with awk and vim.
[1]: https://github.com/johnkerl/miller
-
Perl first commit: a “replacement” for Awk and sed
> This works really well if your problem can be solved in one or two liners.
My personal comfort threshold is around the 100-line mark. It's even possible to write maintainable shell scripts up to 500 lines, but it mostly depends on the problem you're trying to solve and on the programmer's discipline in following best practices (sane defaults, ShellCheck, etc.).
> It goes bad very quickly when, say, you have two CSV files and want to join them the SQL way.
In that case we're talking about structured data, and, yeah, Perl or Python would be easier to work with. That said, depending on the complexity of the CSV, you can still go a long way with plain Bash with IFS/read(1) or tr(1) to split CSV columns. This wouldn't be very robust, but there are tools that handle CSV specifically[1], which can be composed in a shell script just fine.
So it's always a balancing act: being productive quickly with a shell script, or reaching for a programming language once the tools aren't a good fit or maintenance becomes an issue.
[1]: https://miller.readthedocs.io/
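Joining two CSVs the SQL way, which the quoted comment says shell handles poorly, really is only a few lines in a general-purpose language. A minimal hash-join sketch in Python's standard library (the `inner_join` helper is hypothetical, not part of any tool mentioned above):

```python
import csv
import io

def inner_join(left_csv: str, right_csv: str, key: str):
    """SQL-style inner join of two CSV documents on `key`:
    index the right table in a dict, then stream the left table."""
    right_index = {}
    for row in csv.DictReader(io.StringIO(right_csv)):
        right_index.setdefault(row[key], []).append(row)
    for row in csv.DictReader(io.StringIO(left_csv)):
        for match in right_index.get(row[key], []):
            # Merge columns; right-hand columns win on name collisions.
            yield {**row, **match}

people = "id,name\n1,ann\n2,bob\n"
cities = "id,city\n1,oslo\n"
joined = list(inner_join(people, cities, "id"))
```

With Miller the same operation is a single `mlr join` invocation; the point of the sketch is only that "structured data" is where a real language (or a structure-aware tool) starts to beat IFS/read and tr.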
-
Need help on cleaning this data!!
where mlr is from https://github.com/johnkerl/miller
-
Running weekly average
if this class of problems (i.e., csv/tsv data) is your main target you may find miller (https://github.com/johnkerl/miller) much more useful in the long run
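For this class of problem, Miller's built-in verbs do the grouping and averaging for you. As a point of comparison, one reading of "weekly average" (bucket timestamped values by ISO week, then average each bucket) hand-rolled in Python; `weekly_averages` is an illustrative helper, not Miller's API:

```python
import datetime
import statistics
from collections import defaultdict

def weekly_averages(rows):
    """Group (date, value) pairs by ISO (year, week) and
    return the mean of each bucket, in chronological order."""
    buckets = defaultdict(list)
    for date, value in rows:
        year, week, _ = date.isocalendar()
        buckets[(year, week)].append(value)
    return {k: statistics.mean(v) for k, v in sorted(buckets.items())}

samples = [
    (datetime.date(2024, 1, 1), 10.0),
    (datetime.date(2024, 1, 3), 20.0),
    (datetime.date(2024, 1, 8), 30.0),
]
averages = weekly_averages(samples)
```

The hand-rolled version is fine as a one-off, but once the CSV grows quoting rules, missing fields, or multiple group-by keys, a structure-aware tool like mlr pays for itself.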
-
GQL: A new SQL like query language for .git files written in Rust
That said, you may be interested in Miller (https://github.com/johnkerl/miller) which provides similar capabilities for CSV, JSON, and XML files. It doesn't use a SQL grammar, but that's just the proverbial lipstick on the thing. I'm not the author, but I have used it and I see some parallels in use cases at the very least.
- johnkerl/miller: Miller is like awk, sed, cut, join, and sort for name-indexed data such as CSV, TSV, and tabular JSON
-
Any cli utility to create ascii/org mode tables?
worth giving Miller a shot
-
I wrote this iCalendar (.ics) command-line utility to turn common calendar exports into more broadly compatible CSV files.
CSV utilities (still haven't picked a favorite one...): https://github.com/harelba/q https://github.com/BurntSushi/xsv https://github.com/wireservice/csvkit https://github.com/johnkerl/miller
- Miller: Like Awk, sed, cut, join, and sort for CSV, TSV, and tabular JSON
What are some alternatives?
ocaml-containers - A lightweight, modular standard library extension, string library, and interfaces to various libraries (unix, threads, etc.) BSD license.
visidata - A terminal spreadsheet multitool for discovering and arranging data
vnlog - Process labelled tabular ASCII data using normal UNIX tools
xsv - A fast CSV command line toolkit written in Rust.
DataProfiler - What's in your data? Extract schema, statistics and entities from datasets
jq - Command-line JSON processor [Moved to: https://github.com/jqlang/jq]
dasel - Select, put and delete data from JSON, TOML, YAML, XML and CSV files with a single tool. Supports conversion between formats and can be used as a Go package.
rq - Record Query - A tool for doing record analysis and transformation
csvtk - A cross-platform, efficient and practical CSV/TSV toolkit in Golang
yq - yq is a portable command-line YAML, JSON, XML, CSV, TOML and properties processor