The Art of Logging. Are logs for humans? Or machines? Or both?

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • json-logs

    A tool to pretty-print JSON logs, like those from zap or logrus.

  • I think the philosophy for logs is that they are THE user interface for operators of your service. If you run a SaaS, that's you! If you ship software for self-hosting, that's your customers. Logging can't be an afterthought; it's how ordinary people peer deep into your system to understand what's wrong with it. If they can figure it out from the logs, they don't page you. If the logs are useless, you can't fix them after they page you. Be paranoid and make your subsystems show their work. Anything you'd want to know while debugging, print it; don't make the user play "hey, just run this and tell me what it returns". There is an art here, and many treat it as an afterthought.

    I always liked structured logs because they free me from deciding how to format arguments. "New HTTP request for /foobar from 1.2.3.4 with request-id ac31a9e0-5d57-44de-9e98-60fa94d3e866" is a pain to read and maintain. `log.Debug("incoming http request", logger.IP("remote-ip", req.Address), logger.String("request-id", req.ID), logger.String("route", req.Route))` is easy to write, and the resulting logs are easy to analyze!
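    The typed-fields call above can be sketched with Go's standard log/slog package (slog ships with Go 1.21+; the message and field values here are just the ones from the example, not from any real service):

    ```go
    package main

    import (
    	"log/slog"
    	"os"
    )

    func main() {
    	// The JSON handler emits one JSON object per log call, with
    	// typed key/value pairs instead of a hand-formatted string.
    	logger := slog.New(slog.NewJSONHandler(os.Stdout, nil))
    	logger.Info("incoming http request",
    		"remote-ip", "1.2.3.4",
    		"request-id", "ac31a9e0-5d57-44de-9e98-60fa94d3e866",
    		"route", "/foobar",
    	)
    }
    ```

    Each line carries the message plus machine-parseable fields, so downstream tools can filter on `route` or `request-id` without regex gymnastics.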

    I do like myself some pretty, colorful logs, but I prefer JSON as the intermediary. It's easy to post-process JSON logs into something readable. I wrote https://github.com/jrockway/json-logs for this task: give it a stream of JSON logs, get colorful text out. Add or drop fields, or select and highlight interesting lines with JQ or regular expressions (with -A/-B/-C context flags, of course). It's installable on Linux or Mac with Homebrew ;) It's one of the few pieces of software I wrote for myself that passes the "toothbrush test": I use it twice a day, every day. And that's on a good day, when I'm not pulling my hair out over some obscure issue ;)
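    The add/drop-fields side of that workflow can be sketched with plain jq (json-logs has its own flags; `app.log` and the field names below are made-up examples, not from the tool):

    ```shell
    # Hypothetical JSON-lines log file: drop debug-level lines and strip a
    # noisy field before handing the stream to a pretty-printer.
    cat app.log | jq -c 'select(.level != "debug") | del(.stacktrace)'
    ```

    Because each line is a self-contained JSON object, filters like this compose freely in a pipeline before any colorizing step.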

    > Of course in a perfect world you want to ingest these logs into some better searching tool that lets you run SQL or some other standard query language over the set to prune them by date and pull out specific fields but there are times when you'll be hanging on by your nails with nothing to help diagnose the issue except an SSH connection to the server and the log file open in vi.

    That tool is the Logfile Navigator (https://lnav.org). It's a TUI that reads plaintext logs, JSON-lines logs, and others. It will collate messages from all the files into a single log view that you can filter and search. The logs are also plumbed through SQLite virtual tables, so you can do fancy queries.

    When setting up ELK or paying for Splunk/Sumologic is too much, lnav is a pretty good alternative.

