Request-based autoscaling in Kubernetes: scaling to zero

This page summarizes the projects mentioned and recommended in the original post.

  • metrics-server

    Scalable and efficient source of container resource metrics for Kubernetes built-in autoscaling pipelines.

    A metrics server to store and aggregate metrics (Kubernetes doesn't come with one by default).

  • custom-metrics-apiserver

    Framework for implementing custom metrics support for Kubernetes

    KEDA is an event-driven autoscaler that implements the Custom Metrics API.

  • http-add-on

    Add-on for KEDA to scale HTTP workloads

    KEDA has a special scaler that creates an HTTP proxy that measures and buffers requests before they reach the app.

  • keda

KEDA is a Kubernetes-based, event-driven autoscaling component. It provides event-driven scaling for any container running in Kubernetes.

With KEDA, the exporter, metrics server, and proxy come bundled into a single component.
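To ground the metrics-server entry above: metrics-server feeds resource metrics to the built-in HorizontalPodAutoscaler. A minimal sketch, assuming a hypothetical Deployment named `my-app`; note that the stock HPA cannot scale below one replica, which is why the other projects on this list exist.

```yaml
# Minimal HPA on CPU utilization; requires metrics-server to be installed.
# "my-app" is a hypothetical Deployment name, not from the original post.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: my-app
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: my-app
  minReplicas: 1    # the built-in HPA does not scale to zero
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 50
```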
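KEDA's scale-to-zero behavior is configured through a ScaledObject. A sketch under assumed names (`my-app`, a Prometheus instance at `prometheus.monitoring:9090`, and an illustrative request-rate query):

```yaml
# KEDA ScaledObject: scales the workload to zero when the trigger is idle.
# All names and the query are hypothetical examples.
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: my-app
spec:
  scaleTargetRef:
    name: my-app        # target Deployment
  minReplicaCount: 0    # KEDA can scale to zero, unlike the stock HPA
  maxReplicaCount: 10
  triggers:
    - type: prometheus
      metadata:
        serverAddress: http://prometheus.monitoring:9090
        query: sum(rate(http_requests_total{app="my-app"}[1m]))
        threshold: "10"
```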
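The http-add-on's request-buffering proxy is driven by an HTTPScaledObject. A sketch with hypothetical names; field names have shifted between http-add-on versions, so check the release you deploy:

```yaml
# KEDA HTTP add-on: routes traffic for the listed hosts through KEDA's
# interceptor proxy, which counts and buffers requests so the app can
# wake from zero. Names are illustrative.
apiVersion: http.keda.sh/v1alpha1
kind: HTTPScaledObject
metadata:
  name: my-app
spec:
  hosts:
    - my-app.example.com
  scaleTargetRef:
    name: my-app
    service: my-app
    port: 8080
  replicas:
    min: 0    # scale to zero between requests
    max: 10
```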

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.
