35M Hot Dogs: Benchmarking Caddy vs. Nginx

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • website

    The Caddy website (by caddyserver)

  • I wrote up a few notes on my Caddy setup here https://muxup.com/2022q3/muxup-implementation-notes#serving-... which may be a useful reference if you have a static site and want to tick off a few items likely on your list (brotli, HTTP/3, cache-control, more fine-grained control over redirects).

    I don't think performance is ever going to matter for my use case, but one thing I think is worth highlighting is the quality of the community and maintainership. In a thread I started asking for feedback on my Caddyfile (https://caddy.community/t/suggestions-for-simplifying-my-cad...), mholt determined I'd found a bug and rapidly fixed it. I followed up with a PR (https://github.com/caddyserver/website/pull/264) for the docs to clarify something related to this bug which was reviewed and merged within 30 minutes.
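
    As a rough sketch of what such a setup can look like (placeholder domain and paths, not the author's actual config; serving pre-compressed brotli via file_server's precompressed option assumes the assets were compressed ahead of time, and HTTP/3 is enabled by default in current Caddy):

    ```caddyfile
    example.com {
        root * /srv/www/site
        # Prefer pre-compressed .br/.gz files when they exist on disk.
        file_server {
            precompressed br gzip
        }
        # On-the-fly compression for everything else.
        encode zstd gzip
        # Long-lived caching for fingerprinted static assets.
        header /static/* Cache-Control "max-age=31536000, immutable"
        # Fine-grained redirect control.
        redir /old-page /new-page permanent
    }
    ```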

  • haproxy-lua-acme

  • Vegeta

    HTTP load testing tool and library. It's over 9000!
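
    A typical invocation (the target URL and rate here are placeholders) pipes a target list into `vegeta attack` and summarizes the results:

    ```shell
    # Attack a local server at 1000 req/s for 30s, then report latencies.
    echo "GET http://localhost:8080/" | \
      vegeta attack -rate=1000 -duration=30s | \
      vegeta report
    ```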

  • Caddy

    Fast and extensible multi-platform HTTP/1-2-3 web server with automatic HTTPS

  • This is a great writeup overall. However, please note that these tests are also being revised shortly after some brief feedback [0]:

    - The sendfile tests at the end actually didn't use sendfile, so expect much greater performance there.

    - All the tests had metrics enabled, which are known[1] to be quite slow. From my own tests, when I remove metrics code, Caddy is 10-20% faster.

    [0]: https://twitter.com/mholt6/status/1570442275339239424 (thread)

    [1]: https://github.com/caddyserver/caddy/issues/4644

  • nginx-adapter

    Run Caddy with your NGINX config

  • *and memory safety*

    This cannot be overstated. Caddy is not written in C! And it can even run your NGINX configs. :) https://github.com/caddyserver/nginx-adapter
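
    As a sketch of how that looks in practice (assuming xcaddy is installed; paths are placeholders and flags may vary by adapter version):

    ```shell
    # Build a Caddy binary with the nginx config adapter compiled in.
    xcaddy build --with github.com/caddyserver/nginx-adapter

    # Run Caddy directly from an existing nginx config.
    ./caddy run --config /etc/nginx/nginx.conf --adapter nginx
    ```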

  • h2o

    H2O - the optimized HTTP/1, HTTP/2, HTTP/3 server

  • h2o [1] was excellent when I tried it for TLS termination. And it got HTTP/2 priorities right. It's a shame they don't make regular releases.

    1. https://github.com/h2o/h2o/

  • haproxy

    HAProxy Load Balancer's development branch (mirror of git.haproxy.org)

  • It does not, because HAProxy does not perform any disk access at runtime and thus would be unable to persist the certificates anywhere. Disk accesses can be unpredictably slow and would block the entire thread, which is not something you want when handling hundreds of thousands of requests per second.

    See this issue and especially the comment from Lukas Tribus: https://github.com/haproxy/haproxy/issues/1864

    Disclosure: Community contributor to HAProxy, I help maintain HAProxy's issue tracker.

  • souin

    An HTTP cache system, RFC compliant, compatible with @tyktechnologies, @traefik, @caddyserver, @go-chi, @bnkamalesh, @beego, @devfeel, @labstack, @gofiber, @go-goyave, @go-kratos, @gin-gonic, @roadrunner-server, @zalando, @zeromicro, @nginx and @apache

  • It's always been blisteringly fast when we've used it, and I like the power of the configuration (it has its quirks, but so do most powerful systems). But the overhead of setting it up and maintaining it due to having to handle TLS termination separately puts me off using it when other software is 'good enough'. If Varnish Enterprise were cheaper I would have bought it, but at their enterprise prices, no way.

    I'm keeping a watching brief on https://github.com/darkweak/souin and its Caddy integration to see if that can step up and replace Varnish for short-lived dynamic caching of web applications. Though I've lost track of its current status.

  • concurrency-limits

  • Under heavy congestion, goodput (useful throughput) will drop to zero because clients start timing out before the server gets a chance to respond.

    Caddy should use an algorithm similar to: https://github.com/Netflix/concurrency-limits

    Basically: track the best request latency observed, and decrease the concurrency limit until latency stops improving.
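
    A toy sketch of that idea (loosely inspired by Netflix's gradient-based limiters, but not their actual API; the 0.9 threshold and step sizes are arbitrary choices for illustration):

    ```go
    package main

    import "fmt"

    // gradientLimit is a minimal latency-gradient concurrency limiter sketch.
    type gradientLimit struct {
    	limit   float64 // current concurrency limit
    	bestRTT float64 // lowest request latency observed so far (ms)
    }

    // observe updates the limit from a new latency sample: while latency is
    // close to the best seen, grow the limit additively to probe for capacity;
    // once latency degrades, shrink it proportionally to the degradation.
    func (g *gradientLimit) observe(rttMs float64) {
    	if g.bestRTT == 0 || rttMs < g.bestRTT {
    		g.bestRTT = rttMs
    	}
    	gradient := g.bestRTT / rttMs // 1.0 when latency is at its best
    	if gradient > 0.9 {
    		g.limit += 1 // latency still good: probe upward
    	} else {
    		g.limit *= gradient // latency degraded: back off proportionally
    	}
    	if g.limit < 1 {
    		g.limit = 1 // always allow at least one in-flight request
    	}
    }

    func main() {
    	g := &gradientLimit{limit: 10}
    	for _, rtt := range []float64{10, 10, 11, 40, 80, 10} {
    		g.observe(rtt)
    		fmt.Printf("rtt=%vms limit=%.1f\n", rtt, g.limit)
    	}
    }
    ```

    A real limiter would also account for queue depth and smooth the latency samples, but the core loop is the same: stop raising concurrency as soon as latency stops improving.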

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.
