| | replace-response | caddy-ratelimit |
|---|---|---|
| Mentions | 1 | 4 |
| Stars | 88 | 184 |
| Growth | - | - |
| Activity | 4.9 | 6.4 |
| Last commit | 5 months ago | 7 days ago |
| Language | Go | Go |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
replace-response
- The Future of Nginx: Getting Back to Our Open Source Roots
> But there are many scenarios where being able to extend the HTTP server via Lua is more convenient than writing a plugin I would think?
Well, Caddy is written in Go, so it's only natural to write a plugin in Go; plugins are statically compiled into your binary. We provide a tool called `xcaddy` that produces builds of Caddy with any plugins you need. You just need Go installed on your system to run it, no other dependencies.
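For example, a custom build with the two plugins discussed on this page might look like this (module paths taken from the repos linked below; the build fetches them, so network access and a Go toolchain are assumed):

```shell
# install xcaddy (assumes Go is on your PATH)
go install github.com/caddyserver/xcaddy/cmd/xcaddy@latest

# produce a caddy binary with both plugins statically compiled in
xcaddy build \
  --with github.com/caddyserver/replace-response \
  --with github.com/mholt/caddy-ratelimit
```

The resulting `caddy` binary in the current directory includes both modules; `caddy list-modules` will show them.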
The reason why Lua is used for OpenResty is because writing plugins in C is... not fun.
You can absolutely do what you described with an HTTP handler module in Caddy. You'd just wrap the req.Body with a reader that watches the bytes as they're copied through the stream, and when you see the part you want to log, you do that.
We have a replace-response plugin which takes a similar approach, except it manipulates the response as it's being streamed back to the client. https://github.com/caddyserver/replace-response The whole plugin is just one file of Go code.
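Configuring it might look like the following sketch (directive name and ordering taken from the plugin's README at the time of writing; the site and backend address are placeholders):

```
{
	# replace-response has no default directive order
	order replace after encode
}

example.com {
	reverse_proxy localhost:8080
	replace "Foo" "Bar"
}
```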
caddy-ratelimit
- Deploying Web Apps with Caddy: A Beginner's Guide Caddy
You can rate limit HTTP requests (agnostic of specific HTTP versions): https://github.com/mholt/caddy-ratelimit
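A Caddyfile sketch of a per-client limit with this plugin (zone name and numbers are made up for illustration; syntax per the repo's README, which should be checked for the exact options):

```
{
	order rate_limit before basicauth
}

example.com {
	rate_limit {
		zone per_client {
			key    {remote_host}
			events 100
			window 1m
		}
	}
	respond "Hello"
}
```

Here each client IP gets at most 100 requests per one-minute sliding window; requests over the limit receive an HTTP 429.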
- The Future of Nginx: Getting Back to Our Open Source Roots
If you can't already do it with the rate limit module I wrote, open an issue with your detailed requirements: https://github.com/mholt/caddy-ratelimit -- should be pretty straightforward for the most part.
> QoS for a shared-multitenant system, in the presence of customers with really badly tuned and spiky request workloads, whose traffic you must nevertheless mostly accept.
Yah, we see that sometimes. Caddy usually handles it fine, sometimes with a bit of massaging the config.
- Nginx Modern Reference Architectures
- Interactive Halloween decorations with raspberry pi's 🎃
I'm using caddy for my web server due to its awesome automatic certificate functionality using Let's Encrypt behind the scenes. Caddy supports rate limiting via this plugin so I don't have to worry about folks killing my API.
What are some alternatives?
cache-handler - Distributed HTTP caching module for Caddy
Caddy - Fast and extensible multi-platform HTTP/1-2-3 web server with automatic HTTPS
nginx-cluster - A horizontally scalable NGINX caching cluster
caddy-authorize - Authorization Plugin for Caddy v2 (JWT/PASETO)
server-side-tls - Server side TLS Tools
xcaddy - Build Caddy with plugins
caddy-l4 - Layer 4 (TCP/UDP) app for Caddy
caddy-auth-portal - Authentication Plugin for Caddy v2 implementing Form-Based, Basic, Local, LDAP, OpenID Connect, OAuth 2.0 (Github, Google, Facebook, Okta, etc.), SAML Authentication. MFA with App Authenticators and Yubico.
kubernetes-ingress - NGINX and NGINX Plus Ingress Controllers for Kubernetes
pumpkin-pi - Raspberry pi project that controls jack-o-lantern via servo motor and PIR motion sensors
lua-nginx-module - Embed the Power of Lua into NGINX HTTP servers
witchonstephendrive.com - A home automation project to control my Halloween decorations