why are compressed, deduped backups so big? (vorta & borg backup)

This page summarizes the projects mentioned and recommended in the original post on /r/linuxquestions

  • zstd

    Zstandard - Fast real-time compression algorithm

  • Borg also supports compression, but compression cannot work miracles on real-world data. Among the strongest algorithms Borg supports is zstd, which, according to the first benchmark I found online, achieves a compression ratio of less than 3, meaning compressed data averages about ⅓ of its original size. It would take an extremely inefficient data format to do significantly better than that. Images and .git directories are rarely stored that inefficiently, so they cannot be compressed much further (see the sketch after this list).

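The point about already-compressed formats is easy to check empirically. Below is a minimal sketch using the third-party `zstandard` Python bindings (an assumption for illustration; Borg bundles its own compressors): redundant text shrinks many times over, while high-entropy data such as JPEGs or git packfiles barely shrinks at all, which is why whole-filesystem backups rarely beat a ratio of about 3.

```python
# Minimal sketch (not Borg's internals): compare how well zstd shrinks
# redundant text versus data that is effectively already compressed.
# Assumes the third-party `zstandard` bindings (pip install zstandard).
import os
import zstandard

def ratio(data: bytes, level: int = 19) -> float:
    """Return original size / compressed size at the given zstd level."""
    compressed = zstandard.ZstdCompressor(level=level).compress(data)
    return len(data) / len(compressed)

# Highly redundant, text-like data: compresses many times over.
text_like = b"2024-01-01 INFO backup completed without errors\n" * 20_000

# High-entropy stand-in for JPEGs or git packfiles: random bytes are
# essentially incompressible, so the ratio stays close to 1.
high_entropy = os.urandom(len(text_like))

print(f"text-like data:    {ratio(text_like):6.1f}x")
print(f"high-entropy data: {ratio(high_entropy):6.2f}x")
```

In Borg the algorithm and level are chosen per archive, e.g. `borg create --compression zstd,3 ...`; deduplication happens on chunks before compression, so it removes duplicates regardless of how compressible the remaining data is.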
NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.
