Rdiff-backup
Duplicati
| | Rdiff-backup | Duplicati |
|---|---|---|
| Mentions | 32 | 22 |
| Stars | 1,038 | 10,184 |
| Growth | 2.4% | 2.5% |
| Activity | 8.5 | 8.6 |
| Latest commit | 5 days ago | 4 days ago |
| Language | Python | C# |
| License | GNU General Public License v3.0 only | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Rdiff-backup
-
Duplicity
For starters, it has a tendency to paint itself into a corner in ENOSPC situations. You won't even be able to perform a restore if a backup was started but left unfinished because it ran out of space. There's this process of "regressing" the repo [0] which must occur before you can do practically anything after an interrupted/failed backup. What this actually must do is undo the partial forward progress, by performing what's effectively a restore of the files that got pushed into the future relative to the rest of the repository, which requires more space. Unless you have, or can create, free space to do these things, it can become wedged... and if it's a dedicated backup system where you've intentionally filled disks up with restore points, you can find yourself having to throw out backups just to make things functional again; even the ability to restore is affected.
That's the most obvious glaring problem. Beyond that, it's just kind of garbage in terms of the amount of space and time it requires to perform restores, especially restores of files with many reverse-differential increments leading back to the desired restore point. It can require 2X the file's size in spare space to assemble the desired version, since it iteratively reconstructs all the intermediate versions on the way to the desired one. Unless someone has fixed this since I last had to deal with it, which is possible.
Source: Ages ago I worked for a startup[1] that shipped a backup appliance originally implemented by contractors using rdiff-backup. Writing a replacement that didn't suck but was compatible with rdiff-backup's repos consumed several years of my life...
There are far better options in 2024.
[0] https://github.com/rdiff-backup/rdiff-backup/blob/master/src...
[1] https://www.crunchbase.com/organization/axcient
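The restore cost described above can be sketched with a toy model of reverse-differential storage (illustrative only; real rdiff-backup stores librsync rdiff deltas, not this naive byte-patch format): the newest version is kept in full, and restoring an old version must materialize every intermediate version along the way.

```python
# Toy model of reverse-differential storage. The repository keeps the
# newest version in full, plus one reverse delta per older version.

def make_delta(newer: bytes, older: bytes) -> dict:
    """Record what must change to turn `newer` back into `older`."""
    patches = [(i, o) for i, (n, o) in enumerate(zip(newer, older)) if n != o]
    # Bytes present only in the older (longer) version:
    patches += [(i, o) for i, o in enumerate(older[len(newer):], len(newer))]
    return {"length": len(older), "patches": patches}

def apply_delta(newer: bytes, delta: dict) -> bytes:
    """Reconstruct the older version from the newer one plus its delta."""
    buf = bytearray(newer[: delta["length"]])
    buf.extend(b"\0" * (delta["length"] - len(buf)))
    for i, o in delta["patches"]:
        buf[i] = o
    return bytes(buf)

versions = [b"v1 contents", b"v2 contents!", b"v3 contents!!"]
current = versions[-1]
deltas = [make_delta(versions[i + 1], versions[i])
          for i in range(len(versions) - 1)]

# Restoring the oldest version forces reconstruction of every
# intermediate version, each one a full copy needing its own space:
state = current
for d in reversed(deltas):
    state = apply_delta(state, d)
assert state == versions[0]
```

The loop at the end is the pain point the comment describes: a file with many increments pays for every intermediate reconstruction, in both time and scratch space.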
-
Trying to install rdiff-backup on an Oracle Cloud Red Hat VM.
and that should install the latest version, rdiff-backup-2.2.4-2.el8.x86_64.rpm. This is all described in the rdiff-backup README file.
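The install command itself didn't survive extraction. On a RHEL 8-family system the usual route is via EPEL, roughly like this (a hedged sketch; the EPEL release URL and package availability are assumptions, not from the original post):

```shell
# Enable EPEL (rdiff-backup for RHEL 8 is packaged there), then install.
# URL and package name are assumptions; check your distro's documentation.
sudo dnf install -y https://dl.fedoraproject.org/pub/epel/epel-release-latest-8.noarch.rpm
sudo dnf install -y rdiff-backup
rdiff-backup --version
```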
- Cache operation: archive
-
How do I copy data from one HDD to another using Linux Mint?
Rdiff-backup - close to what you do currently, but it at least provides versioning. Based on rsync.
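A minimal session for the HDD-to-HDD case might look like this (paths are hypothetical; the classic rdiff-backup CLI syntax is shown, and newer 2.x releases also offer `backup`/`restore` subcommands):

```shell
# First run creates a full mirror plus an rdiff-backup-data directory;
# later runs store reverse increments alongside the current mirror.
rdiff-backup /mnt/old-hdd /mnt/new-hdd

# List the restore points that have accumulated:
rdiff-backup --list-increments /mnt/new-hdd

# Pull back a file as it was three days ago:
rdiff-backup --restore-as-of 3D /mnt/new-hdd/somefile /tmp/somefile
```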
-
Accomplishing What I Want With What I Have
As in just a copy of your files? I'd barely consider that a backup; it's more of a mirror from a point in time. What are you missing by doing this? Versions of files, deduplication, and encryption (the last one being very important for the best kind of backups, which should be off-site). Just because a format isn't plain files doesn't mean it's proprietary; proprietary would mean secret and undocumented. There are many great options: Borg is my favorite, but Kopia is probably better if you use Windows; UrBackup is an option if you want centralized management of backups; and rdiff-backup gives you something like what you have currently but with versioning added, though it lacks deduplication and encryption.
-
Backup software recommendation
If you're comfortable with the CLI and you want your backup in a plain file format with incremental backups, there's rdiff-backup. It uses rsync under the hood and has worked quite well for me.
-
Name a program that doesn't get enough love!
Rdiff Backup - Reverse-differential backups that use rsync, linking, and can tunnel via ssh. You get a full current backup, with increments available to restore any version of a file, using minimal storage space.
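The ssh tunneling mentioned above looks roughly like this in practice (host and paths are invented for illustration; rdiff-backup's `host::path` remote syntax requires rdiff-backup installed on both ends):

```shell
# Push a local directory to a remote repository over ssh:
rdiff-backup /home/alice user@backuphost::/srv/backups/alice

# Restore a single file as it existed ten days ago:
rdiff-backup --restore-as-of 10D \
    user@backuphost::/srv/backups/alice/notes.txt ./notes.txt
```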
-
BorgBackup, Deduplicating archiver with compression and encryption
borg is great. We've been using it for the past 3 years to archive hundreds of file-level backups of servers, database dumps, and VM images. The average size of each borg repo is a few GB, but there are a few outliers of up to a few hundred GB.
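For context, a typical borg workflow for the kind of server archiving described above might look like this (repository path and retention policy are invented for illustration):

```shell
# One-time: create an encrypted, deduplicating repository.
borg init --encryption=repokey /backups/repo

# Nightly: archive with compression; {hostname}-{now} names the archive.
borg create --stats --compression zstd \
    /backups/repo::{hostname}-{now} /etc /home

# Thin out old archives according to a retention policy.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /backups/repo
```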
borg replaced https://rdiff-backup.net/ for us and gave:
-
Advice for Automated Copying of my Off Grid 6TB Media Hoard :)
Robocopy is great if you don't have access to rsync. If rsync (via WSL2, for instance) is an option, I'd personally go with rdiff-backup.
- Do incremental backups generally store only the delta of each file change or the entire new file?
Duplicati
-
Your thoughts on C#, and my homework
85% of this code, which is entrusted with petabytes of data, is C#: https://github.com/duplicati/duplicati
-
Is there a non-beta version?
For my parents' computer, I've been using the canary version (which I think is effectively the alpha channel) of Duplicati for years.
-
Nextcloud noob: How can I auto backup photos & files to AWS/iCloud?
It hasn't had a release in a little while but work is still occurring. https://github.com/duplicati/duplicati/actions.
- Most used selfhosted services in 2022?
- Backup Windows PC to Minio/S3
-
Announcing Duplicati Dashboard
Hey, have a read at: https://github.com/duplicati/duplicati/issues/4041
-
A Dummies Guide to Duplicati
I just came across this while looking through their issues to see if anyone else had reported the Firefox issue I'm running into. I'm starting to have serious reservations.
- Apparently you cannot have the Kanye interview on Google Drive now
-
Borg vs Duplicacy (not Duplicati or Duplicity)?
I like Duplicacy because it keeps the chunks directly in the file system, without a special database. That makes it scale up really well no matter how many backups you have (you can even back up multiple computers to the same storage). The way you select what to back up with symlinks (in the command-line version) is beyond weird and looks more like something one would hack together for oneself in a weekend (not that I'm complaining about free software!), but it has been bug-free for me and extremely efficient. Duplicati, in contrast, has a polished interface and is well maintained, but it bogs down on any large backup; there are stories of people spending weeks recovering just a few local TBs, and I've experienced this myself. In my case, granted, the slowdown was in the Python that checks the SHA-256 checksums of the backups, but it made things many times (possibly hundreds of times) slower. Did nobody check this between 2013 and 2021, or did they only test on tiny datasets like 1 GB, or were they content to wait weeks even for something small-ish?
- C# library for centralized cloud storage syncing?
What are some alternatives?
BorgBackup - Deduplicating archiver with compression and authenticated encryption.
restic - Fast, secure, efficient backup program
UrBackup - Client/Server Open Source Network Backup for Windows, MacOS and Linux
Rsnapshot - a tool for backing up your data using rsync (if you want to get help, use https://lists.sourceforge.net/lists/listinfo/rsnapshot-discuss)
Duplicity - Unofficial fork of Duplicity - Bandwidth Efficient Encrypted Backup
syncthing-android - Wrapper of syncthing for Android.
rclone - "rsync for cloud storage" - Google Drive, S3, Dropbox, Backblaze B2, One Drive, Swift, Hubic, Wasabi, Google Cloud Storage, Azure Blob, Azure Files, Yandex Files
TimeShift - System restore tool for Linux. Creates filesystem snapshots using rsync+hardlinks, or BTRFS snapshots. Supports scheduled snapshots, multiple backup levels, and exclude filters. Snapshots can be restored while system is running or from Live CD/USB.