| | lrzip | bupstash |
| --- | --- | --- |
| Mentions | 6 | 11 |
| Stars | 595 | 873 |
| Growth | - | - |
| Activity | 3.7 | 1.3 |
| Latest commit | 15 days ago | 3 months ago |
| Language | C | Rust |
| License | GNU General Public License v3.0 only | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
lrzip
-
How to Get Your Backup to Half of Its Size – ZSTD Support in XtraBackup
lrzip
Long Range ZIP or LZMA RZIP
https://github.com/ckolivas/lrzip
"A compression utility that excels at compressing large files (usually > 10-50 MB). Larger files and/or more free RAM means that the utility will be able to more effectively compress your files (ie: faster / smaller size), especially if the filesize(s) exceed 100 MB. You can either choose to optimise for speed (fast compression / decompression) or size, but not both."
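A sketch of typical lrzip command-line usage along the speed-vs-size trade-off the description mentions; flags are taken from lrzip's documented options, but check `man lrzip` for your version:

```shell
# Compress a large disk image; the default backend is LZMA,
# producing image.img.lrz alongside the original.
lrzip image.img

# Optimise for speed: LZO backend (-l) is fast but compresses less.
lrzip -l image.img

# Optimise for size: ZPAQ backend (-z) at max level (-L 9), much slower.
lrzip -z -L 9 image.img

# Decompress (lrunzip is equivalent to lrzip -d).
lrunzip image.img.lrz
```

Note that the rzip-style long-range dedup pass benefits from free RAM, which is why the description above says larger files compress better on machines with more memory.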
-
File compression
7zip and XZ are almost always the best in any comparison. (They use the same algorithm.) Occasionally something new comes along that may be better, but it fades away... Like lrzip. https://lkml.org/lkml/2011/6/4/23 https://github.com/ckolivas/lrzip
-
If we found a way to reverse a hashing function, would that make them ultra-compression algorithms?
For example, lrzip has an intense "dupe hunting" mode that takes days for large content, but it does compress very well once it's done (and expansion is fast). I use it on long-term storage backups, disk images, and junk. It's completely incompatible with streaming, unlike chunk-based formats such as gzip or deflate, although unpacking can stream, e.g. when searching or verifying a tarfile archive. But the original source has to be file-based so that seeking for the dupe hunting can work across the entire file as a block.
- Lrzip – Long Range Zip or LZMA RZIP
-
Ask HN: How would you store 10PB of data for your startup today?
Best I know of for that is something like lrzip still, but even then it's probably not state of the art. https://github.com/ckolivas/lrzip
It'll also take a hell of a long time to do the compression and decompression. It'd probably be better to do some kind of chunking and deduplication instead of compression itself, simply because I don't think you're ever going to have enough RAM to store any kind of dictionary that would effectively handle so much data. You'd also not want to have to re-read and reconstruct that dictionary to get at some random image.
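The chunk-and-deduplicate approach the comment suggests (and which tools like bupstash, restic, and Borg use instead of one giant compression dictionary) can be sketched in a few lines. This is a minimal illustration, not any tool's actual algorithm: boundaries are picked by a weak rolling sum over the last few bytes, so identical regions split into identical chunks and are stored only once.

```python
import hashlib

def chunks(data: bytes, mask: int = 0x3FF, window: int = 16):
    """Yield content-defined chunks: cut where the low bits of a rolling
    sum over recent bytes are all ones (roughly every 1 KiB on random data)."""
    start, rolling = 0, 0
    for i, byte in enumerate(data):
        rolling = ((rolling << 1) + byte) & 0xFFFFFFFF
        if i - start >= window and (rolling & mask) == mask:
            yield data[start:i + 1]
            start, rolling = i + 1, 0
    if start < len(data):
        yield data[start:]

def dedup(data: bytes):
    """Store each distinct chunk once, keyed by its SHA-256 digest;
    the 'recipe' lists digests in order so the data can be rebuilt."""
    store, recipe = {}, []
    for chunk in chunks(data):
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        recipe.append(digest)
    return store, recipe

def restore(store: dict, recipe: list) -> bytes:
    """Reassemble the original bytes from the chunk store and recipe."""
    return b"".join(store[d] for d in recipe)
```

Because boundaries depend only on local content, a repeated or shifted region re-synchronizes to the same chunk boundaries and deduplicates, which is what makes this approach work at multi-TB scale without a global dictionary in RAM.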
-
Encrypted Backup Shootout
There's also lrzip for large files: https://github.com/ckolivas/lrzip
bupstash
-
Kopia: Open-Source, Fast and Secure Open-Source Backup Software
bupstash supports it; however, I didn't try it out:
https://github.com/andrewchambers/bupstash/blob/master/doc/g...
-
Backups in NixOS
bupstash
-
BorgBackup, Deduplicating archiver with compression and encryption
I tried a few backup tools and https://github.com/andrewchambers/bupstash is my favorite by far but it's not that well known.
It was pretty fast already and recently got multithreading support. It has been the only thing usable, for performance reasons, for backing up a few TB on a Raspberry Pi.
Keep in mind it's relatively new, and the author does not yet recommend using it in production as the only backup solution.
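For readers who haven't seen bupstash before, a basic workflow looks roughly like the following, based on the project's getting-started guide; command names and environment variables may differ between versions, so treat this as a sketch:

```shell
# Create a repository and an encryption key.
export BUPSTASH_REPOSITORY="$HOME/backups.bupstash"
bupstash init
bupstash new-key -o "$HOME/backups.key"
export BUPSTASH_KEY="$HOME/backups.key"

# Back up a directory; put prints the id of the new snapshot.
id=$(bupstash put ./my-data)

# List snapshots, and restore one as a tarball.
bupstash list
bupstash get "id=$id" > my-data.tar
```

Snapshots are encrypted client-side and deduplicated against everything already in the repository, which is why incremental backups of large, mostly-unchanged trees stay fast.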
- Using Git For Backups
- Restic: Backups Done Right
- Deduplicating Archiver with Compression and Encryption
-
Encrypted Backup Shootout
bupstash (rust) - https://github.com/andrewchambers/bupstash
The authors bupstash[1] tool looks interesting.
I see there's an issue open for Windows support; how is that going with Rust?
Unless it's doing low-level stuff like directory monitoring, I'd assume Rust would be quite portable?
[1]: https://github.com/andrewchambers/bupstash
-
What's everyone working on this week (53/2020)?
Benchmarking my backup tool, which is written in Rust: https://github.com/andrewchambers/bupstash . Rust did not disappoint when it comes to performance; it seems to beat restic by a factor of 2x-200x depending on the benchmark.
What are some alternatives?
rdedup - Data deduplication engine, supporting optional compression and public key encryption.
kopia - Cross-platform backup tool for Windows, macOS & Linux with fast, incremental backups, client-side end-to-end encryption, compression and data deduplication. CLI and GUI included.
duplicity - mirror of duplicity: https://code.launchpad.net/duplicity
Bup - Very efficient backup system based on the git packfile format, providing fast incremental saves and global deduplication (among and within files, including virtual machine images). Please post problems or patches to the mailing list for discussion (see the end of the README below).
LeoFS - The LeoFS Storage System
restic - Fast, secure, efficient backup program
BorgBackup - Deduplicating archiver with compression and authenticated encryption.
ParlAI - A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
tarsnap - Command-line client code for Tarsnap.
nfreezer - nFreezer is an encrypted-at-rest backup tool.
rclone - "rsync for cloud storage" - Google Drive, S3, Dropbox, Backblaze B2, One Drive, Swift, Hubic, Wasabi, Google Cloud Storage, Azure Blob, Azure Files, Yandex Files