| | Duplicity | Rdiff-backup |
|---|---|---|
| Mentions | 7 | 32 |
| Stars | 50 | 1,038 |
| Growth | - | 0.7% |
| Activity | 0.0 | 8.3 |
| Last commit | over 12 years ago | 3 days ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 only | GNU General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Duplicity
- Restic: Backups Done Right
http://duplicity.nongnu.org/ at least can use PGP public keys. I've used it for a long time and not seen any particular reason to change.
- Encrypt channel.backup?
There are backup tools with built-in encryption like borg backup or duplicity, these should be fine. If you already have a backup process and it's missing encryption then you should be able to use e.g. age or gpg.
- What is everyone using to backup their multiple TB's of data?
For my family photos (critical, irreplaceable, on Plex), I use duplicity, which can make use of Amazon Glacier and Deep Archive for really cheap storage ($0.00099/GB/month, no joke) with incremental versioning and client-side encryption. Long restore time, but perfect for disaster recovery on data that doesn't change much. I want to set up the same for music (which rarely but sometimes changes, e.g. correcting tags).
- What do you wish you knew before starting grad school?
And Google Docs / Apple cloud etc. aren't proper backups. They can cancel your account, be inaccessible, or even get hacked. There's software like duplicity that can upload encrypted backups to multiple services, which is handy. But in any case, if you're doing cloud backups, do keep redundant local backups too. My setup: a USB stick attached to a Raspberry Pi, using something called borg to do daily backups over SSH.
- [QUESTION] Simple bash script, using 'expect', to download backups off a server, will connect and only dl 10-15mb of the 10gb file before exiting. Help?
- Happy World Backup Day!
I have had good success using [Duplicity](http://duplicity.nongnu.org/) via [Duply](https://www.duply.net/) for a few years now. The main point for me is that duplicity directly backs up to many cloud-storage endpoints. I'm using google drive specifically, but it supports a ton of options.
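A Duply profile is just a shell-style conf file that fills in duplicity's options. A minimal sketch, from memory of the template that `duply <profile> create` generates (variable names and the TARGET URL scheme are assumptions; verify against your generated conf):

```shell
# ~/.duply/myprofile/conf  -- hypothetical profile name
GPG_KEY='ABC123DEF'          # placeholder key ID used for encryption
GPG_PW='passphrase'          # or leave unset and use gpg-agent
SOURCE='/home/user/photos'   # what to back up
TARGET='gdrive://...'        # backend URL; scheme depends on duplicity version
MAX_AGE=6M                   # purge backups older than this
```

Swapping the TARGET line is all it takes to move between the cloud-storage backends mentioned above.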
- Duplicity: Encrypted bandwidth-efficient backup using the rsync algorithm
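The "rsync algorithm" in that title is a signature/delta/patch scheme: hash fixed-size blocks of the file the receiver already has, then transmit only literal bytes for regions that don't match a known block. A toy Python sketch of the idea (illustrative only; duplicity actually uses librsync, which uses a cheap rolling weak checksum instead of re-hashing at every offset):

```python
import hashlib

BLOCK = 4  # tiny block size so the example is easy to follow

def signature(old: bytes) -> dict:
    """Map hash-of-block -> block index for the file the receiver has."""
    return {hashlib.md5(old[i:i + BLOCK]).digest(): i // BLOCK
            for i in range(0, len(old), BLOCK)}

def delta(new: bytes, sig: dict) -> list:
    """Encode the new file as copy ops (block index) plus literal bytes."""
    ops, i = [], 0
    while i < len(new):
        h = hashlib.md5(new[i:i + BLOCK]).digest()
        if len(new) - i >= BLOCK and h in sig:
            ops.append(("copy", sig[h]))       # block already on the receiver
            i += BLOCK
        else:
            ops.append(("lit", new[i:i + 1]))  # ship the byte itself
            i += 1
    return ops

def patch(old: bytes, ops: list) -> bytes:
    """Receiver rebuilds the new file from its old copy plus the delta."""
    out = b""
    for kind, arg in ops:
        out += old[arg * BLOCK:arg * BLOCK + BLOCK] if kind == "copy" else arg
    return out
```

Only the `lit` ops cross the wire as data, which is why a small edit to a large file costs roughly the size of the edit, not the file.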
Rdiff-backup
- Duplicity
For starters, it has a tendency to paint itself into a corner in ENOSPC situations. You won't even be able to perform a restore if a backup was started but left unfinished because it ran out of space. There's a process of "regressing" the repo [0] which must occur before you can do practically anything after an interrupted/failed backup. What this actually must do is undo the partial forward progress, by performing what's effectively a restore of the files that got pushed into the future relative to the rest of the repository, and that requires more space. Unless you have, or can create, free space to do these things, it can become wedged... and if it's a dedicated backup system where you've intentionally filled disks up with restore points, you can find yourself having to throw out backups just to make things functional again; even the ability to restore is affected.
That's the most obvious glaring problem. Beyond that, it's just kind of garbage in terms of the amount of space and time it requires to perform restores, especially restores of files having many reverse-differential increments leading back to the desired restore point. It can require 2X the file's size in spare space to assemble the desired version, while it iteratively reconstructs all the intermediate versions on the way to the desired one. Unless someone has fixed this since I last had to deal with it, which is possible.
Source: Ages ago I worked for a startup[1] that shipped a backup appliance originally implemented by contractors using rdiff-backup. Writing a replacement that didn't suck but was compatible with rdiff-backup's repos consumed several years of my life...
There are far better options in 2024.
[0] https://github.com/rdiff-backup/rdiff-backup/blob/master/src...
[1] https://www.crunchbase.com/organization/axcient
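The restore cost described in that comment follows directly from the reverse-increment layout: the repository keeps a plain mirror of the newest snapshot plus a chain of reverse deltas, so restoring an older version means reconstructing every intermediate version on the way back. A toy Python model of the layout (not rdiff-backup's real on-disk format, which stores librsync deltas as increment files; `ReverseIncrementRepo` and the string-diff helpers are invented for illustration):

```python
from difflib import SequenceMatcher

def reverse_delta(new: str, old: str) -> list:
    """Opcodes that turn `new` back into `old` (a reverse increment)."""
    sm = SequenceMatcher(a=new, b=old, autojunk=False)
    return [(tag, i1, i2, old[j1:j2]) for tag, i1, i2, j1, j2 in sm.get_opcodes()]

def apply_delta(new: str, ops: list) -> str:
    out = []
    for tag, i1, i2, repl in ops:
        out.append(new[i1:i2] if tag == "equal" else repl)
    return "".join(out)

class ReverseIncrementRepo:
    """Mirror of the latest version, plus reverse deltas to each older one."""
    def __init__(self, content: str):
        self.mirror = content
        self.increments = []  # increments[k] turns version k+1 into version k

    def backup(self, new: str) -> None:
        self.increments.append(reverse_delta(new, self.mirror))
        self.mirror = new

    def restore(self, version: int) -> str:
        # Walks the chain newest-to-oldest, materializing every intermediate
        # version: this is the space/time cost the comment above describes.
        cur = self.mirror
        for ops in reversed(self.increments[version:]):
            cur = apply_delta(cur, ops)
        return cur
```

Restoring the latest version is free (it's the mirror); restoring version 0 of a file with N increments applies all N deltas, which is why long increment chains are expensive.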
- Trying to install rdiff-backup on an Oracle Cloud Red Hat VM.
and that should install the latest version, rdiff-backup-2.2.4-2.el8.x86_64.rpm. This is all described in the rdiff-backup README file.
- Cache operation: archive
- How do I copy data from one HDD to another using Linux Mint?
Rdiff-backup - close to what you do currently, but at least provides versioning. Based on rsync.
- Accomplishing What I Want With What I Have
As in just a copy of your files? That I would barely consider a backup; it's more of a mirror from a point in time. What are you missing by doing this? Versions of files, deduplication, and encryption (the last one being very important for the best kind of backups, which should be off-site). Just because it's not plain files doesn't mean it's proprietary; proprietary would mean secret and undocumented. There are many great options: Borg is my favorite, but Kopia is probably better if you use Windows, urbackup is an option if you want centralized management of backups, and rdiff-backup if you want something close to what you have currently, adding versioning but lacking deduplication and encryption.
- Backup software recommendation
If you're comfortable with the CLI and you want your backup in a plain file format with incremental backups, there's rdiff-backup. It uses rsync under the hood and has worked quite well for me.
- Name a program that doesn't get enough love!
Rdiff-backup - reverse-differential backups using rsync and hard linking, and it can tunnel via SSH. You get a full current backup, with increments available to restore any version of a file with minimal storage space used.
- BorgBackup, Deduplicating archiver with compression and encryption
Borg is great. We've been using it for the past 3 years to archive hundreds of file-level backups of servers, database dumps, and VM images. The average size of each borg repo is a few GB, but there are a few outliers of up to a few hundred GB.
borg replaced https://rdiff-backup.net/ for us and gave:
- Advice for Automated Copying of my Off Grid 6TB Media Hoard :)
Robocopy is great if you don't have access to rsync. If rsync (via WSL2, for instance) is an option, I'd personally go with rdiff-backup.
- Do incremental backups generally store only the delta of each file change or the entire new file?
What are some alternatives?
BorgBackup - Deduplicating archiver with compression and authenticated encryption.
Duplicati - Store securely encrypted backups in the cloud!
restic - Fast, secure, efficient backup program
Rsnapshot - a tool for backing up your data using rsync (if you want to get help, use https://lists.sourceforge.net/lists/listinfo/rsnapshot-discuss)
TimeShift - System restore tool for Linux. Creates filesystem snapshots using rsync+hardlinks, or BTRFS snapshots. Supports scheduled snapshots, multiple backup levels, and exclude filters. Snapshots can be restored while system is running or from Live CD/USB.
syncthing-android - Wrapper of syncthing for Android.
UrBackup - Client/Server Open Source Network Backup for Windows, macOS and Linux
Back In Time - An easy-to-use backup tool for GNU/Linux using rsync in the back
Duplicacy - A new generation cloud backup tool