Which compression algorithm/file type for backups?

This page summarizes the projects mentioned and recommended in the original post on /r/DataHoarder

  • zpaqfranz

    Deduplicating archiver with encryption and paranoid-level tests. Swiss army knife for the serious backup and disaster recovery manager. Ransomware neutralizer. Win/Linux/Unix

  • ZPAQ is great: it supports incremental backups, and its compression algorithm performs well even on already-compressed data like images. zpaqfranz is a fork that adds more advanced checksumming algorithms; it's also worth looking into. A sketch of the incremental workflow follows below.
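
    A minimal sketch of that incremental workflow, assuming the stock zpaq CLI; the archive name and source path here are placeholders:

        # Create or update the archive; each run appends a new version
        # containing only the blocks that changed since the last run.
        zpaq add backup.zpaq ~/photos -method 5

        # List the versions and files stored in the archive.
        zpaq list backup.zpaq

        # Restore the tree as it looked at an earlier version (here: 2).
        zpaq extract backup.zpaq -to restore -until 2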

  • par2cmdline

    Official repo for par2cmdline and libpar2

  • Or use Parchive (par2) to generate parity files, and preferably store these on a different drive or optical disc so a single media failure cannot take out both the data and its parity; a sketch follows below.
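
    For example, with par2cmdline (filenames are placeholders; -r10 requests roughly 10% redundancy, and create writes a set of .par2 recovery volumes alongside the index file):

        # Generate parity data covering the archive.
        par2 create -r10 backup.zpaq.par2 backup.zpaq

        # Later, check the archive against the parity data.
        par2 verify backup.zpaq.par2

        # If damage is detected, use the parity data to repair the file.
        par2 repair backup.zpaq.par2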
