How to ensure file integrity?

This page summarizes the projects mentioned and recommended in the original post on /r/DataHoarder.

  • HBBatchBeast

    A free GUI application for HandBrake and FFmpeg/FFprobe with an emphasis on batch conversion (including recursive folder scans and folder watching) - Windows, macOS, Linux & Docker

    Videos can be verified with HBBatchBeast, for example - just download and run the first version, and then the latest, to create the proper config files.

  • zpaqfranz

    Deduplicating archiver with encryption and paranoid-level tests. Swiss army knife for the serious backup and disaster recovery manager. Ransomware neutralizer. Win/Linux/Unix

    Now, onto file backups - if you value your data, don't make just one backup copy, make two or three. I'd also recommend using software that takes snapshots, so you can restore whichever version you need. I have been using zpaqfranz for a few years now; it is command-line software, but you can make a batch file and update the archive whenever needed - it will add only new and changed files, so only the first backup will take long.
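The incremental workflow described above can be sketched as a small shell script you run whenever you want to update the archive. The `a` (add), `t` (test), `l` (list) and `x` (extract) commands and the `-until` switch are standard zpaqfranz usage; the archive name and paths here are hypothetical examples:

```shell
#!/bin/sh
# Append new and changed files to the archive as a new version
# (only the first run copies everything; later runs are incremental).
zpaqfranz a /backups/docs.zpaq /home/user/documents

# Run the archive's integrity tests after updating.
zpaqfranz t /backups/docs.zpaq

# List the stored versions, then restore e.g. version 3 into a scratch
# directory instead of the latest state.
zpaqfranz l /backups/docs.zpaq
zpaqfranz x /backups/docs.zpaq -until 3 -to /tmp/restore
```

Because every run appends a new version rather than overwriting, any earlier snapshot of the data stays recoverable with `-until`.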


  • MultiPar

    Parchive tool

    And last but not least - make regular parity files of your healthy, important data, and store them somewhere safe. MultiPar is your best friend here. Even if your data gets corrupted, with parity files you can restore it to its former glory. Some limitations apply, so read the manual.
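MultiPar is a GUI tool, but it works with the PAR2 recovery format, so the same idea can be sketched with the command-line `par2` (par2cmdline) tool. The file names and the 10% redundancy level below are illustrative assumptions:

```shell
#!/bin/sh
# Create PAR2 recovery files with 10% redundancy for the files in ./photos.
# More redundancy means more corruption can be repaired, at the cost of space.
par2 create -r10 photos.par2 ./photos/*

# Later: verify the data against the parity files...
par2 verify photos.par2

# ...and, if corruption was detected, repair it from the recovery blocks.
par2 repair photos.par2
```

As the quoted advice notes, repair only works within the redundancy you created, so pick the `-r` level to match how much damage you want to be able to survive.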

NOTE: The number of mentions on this list counts mentions in common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.
