lrzip vs rclone

| | lrzip | rclone |
|---|---|---|
| Mentions | 6 | 963 |
| Stars | 595 | 44,201 |
| Growth | - | 1.9% |
| Activity | 3.7 | 9.8 |
| Latest commit | 23 days ago | 4 days ago |
| Language | C | Go |
| License | GNU General Public License v3.0 only | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
lrzip
- How to Get Your Backup to Half of Its Size – ZSTD Support in XtraBackup
lrzip
Long Range ZIP or LZMA RZIP
https://github.com/ckolivas/lrzip
"A compression utility that excels at compressing large files (usually > 10-50 MB). Larger files and/or more free RAM means that the utility will be able to more effectively compress your files (ie: faster / smaller size), especially if the filesize(s) exceed 100 MB. You can either choose to optimise for speed (fast compression / decompression) or size, but not both."
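As a rough illustration of that speed-versus-size trade-off, the commands below sketch typical lrzip invocations (the file name is a placeholder; actual ratios and run times depend on the data and available RAM):

```shell
# Default: rzip long-range dedup pass followed by LZMA compression.
lrzip big-backup.tar        # writes big-backup.tar.lrz

# Optimise for speed: fast LZO backend after the rzip pass.
lrzip -l big-backup.tar

# Optimise for size: the much slower ZPAQ backend.
lrzip -z big-backup.tar

# Decompress.
lrunzip big-backup.tar.lrz
```

The long-range dedup pass is what distinguishes lrzip from chunk-based compressors: redundancy far apart in a large file can still be found, at the cost of needing the whole file on disk.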
- File compression
7zip and XZ are almost always the best in any comparison. (They use the same algorithm.) Occasionally something new comes along that may be better, but it fades away... Like lrzip. https://lkml.org/lkml/2011/6/4/23 https://github.com/ckolivas/lrzip
- If we found a way to reverse a hashing function, would that make them ultra-compression algorithms?
For example, lrzip has an intense "dupe hunting" mode that takes days for large content, but it compresses very well once it's done (and expansion is fast). I use it on long-term storage backups, disk images, and junk. It's completely incompatible with streaming, unlike chunk-based formats such as gzip or deflate, although unpacking can stream, e.g. when searching or verifying a tar archive. The original source has to be file-based, though, so that seeking for the dupe hunting can work across the entire file as a block.
- Lrzip – Long Range Zip or LZMA RZIP
- Ask HN: How would you store 10PB of data for your startup today?
Best I know of for that is something like lrzip still, but even then it's probably not state of the art. https://github.com/ckolivas/lrzip
It'll also take a hell of a long time to do the compression and decompression. It'd probably be better to do some kind of chunking and deduplication instead of compression itself simply because I don't think you're ever going to have enough ram to store any kind of dictionary that would effectively handle so much data. You'd also not want to have to re-read and reconstruct that dictionary to get at some random image too.
- Encrypted Backup Shootout
There's also lrzip for large files: https://github.com/ckolivas/lrzip
rclone
- Supabase Storage: now supports the S3 protocol
rclone: a command-line program to manage files on cloud storage.
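To make that concrete, here is a sketch of syncing a local directory to an S3-compatible bucket with rclone; the remote name `s3remote` and the bucket name are placeholders you would first define via `rclone config`:

```shell
# Preview what would change without touching the remote.
rclone sync ~/data s3remote:my-bucket --dry-run

# Actually sync, showing transfer progress.
rclone sync ~/data s3remote:my-bucket --progress

# Verify that source and destination match.
rclone check ~/data s3remote:my-bucket
```

Note that `sync` makes the destination match the source, deleting extra remote files; use `rclone copy` if you only want to add files.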
- World Backup Day
- S3 Client against disasters (hacks, fires, catastrophes)
Synchronise buckets with Sclone or Rclone
- Show HN: Query Your Sheets with SheetSQL
- Rclone syncs your files to cloud storage
Says that Apple doesn't provide a multi-platform API. It doesn't provide any officially supported way to access iCloud from Windows or Linux.
There's a ticket covering everything you might ever want to know:
https://github.com/rclone/rclone/issues/1778
- Ask HN: Best modern file transfer/synchronization protocol?
seconding rsync and syncthing.
the server could expose an smb or nfs share, the client could mount it, and then sync to that mount.
rsync over ssh also works, if you do not want to run smb/nfs.
this is also a cool tool https://rclone.org/
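As an illustration of the rsync-over-ssh approach mentioned above (the host, user, and paths are placeholders):

```shell
# Push local changes to a remote host over ssh; -a preserves
# permissions and timestamps, -v is verbose, -z compresses in transit.
rsync -avz -e ssh ~/projects/ user@backup-host:/srv/backups/projects/

# Add --delete to make the destination an exact mirror (use with care).
rsync -avz --delete -e ssh ~/projects/ user@backup-host:/srv/backups/projects/
```

The trailing slash on the source directory matters: with it, rsync copies the directory's contents; without it, it copies the directory itself.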
- Ask HN: How do you do personal backups in 2023? (Google and Dropbox issues)
rclone [1] to dropbox. has been working for years without problems
[1] https://rclone.org/
- Which synchronization tool are you using together with the pCloud Crypto Folder?
rclone provides a dedicated pCloud config option, which makes the setup straightforward. rclone can encrypt the data it uploads with its own encryption, but not with pCloud's encryption. It can therefore only upload data to the unencrypted pCloud folders, not to the Crypto Folder.
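A sketch of what the corresponding rclone.conf might look like, with a crypt remote layered on top of an unencrypted pCloud folder (the remote names, the `backups` folder path, and the elided passwords are placeholders; real entries are generated interactively by `rclone config`):

```
[pcloud]
type = pcloud
# token = {...}  obtained interactively via `rclone config`

[pcloud-crypt]
type = crypt
remote = pcloud:backups
# rclone stores these obscured, not in plain text
password = ...
password2 = ...
```

Uploading through `pcloud-crypt:` then writes rclone-encrypted files into the plain `backups` folder on pCloud, which is exactly why the Crypto Folder itself stays out of reach.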
- Backup of Google Drive (and photos?) to local disk (not to Google Drive)
- All I want for Christmas is
The arkclone project implements rclone in ArkOS to achieve cloud saves. It isn't built into ArkOS yet, and the pull request to get it added hasn't seen much recent traction, but it can be installed manually.
What are some alternatives?
bupstash - Easy and efficient encrypted backups.
syncthing - Open Source Continuous File Synchronization
rdedup - Data deduplication engine, supporting optional compression and public key encryption.
Cryptomator - Multi-platform transparent client-side encryption of your files in the cloud
duplicity - mirror of duplicity: https://code.launchpad.net/duplicity
rsync - An open source utility that provides fast incremental file transfer. It also has useful features for backup and restore operations among many other use cases.
LeoFS - The LeoFS Storage System
s3fs-fuse - FUSE-based file system backed by Amazon S3
BorgBackup - Deduplicating archiver with compression and authenticated encryption.
Duplicati - Store securely encrypted backups in the cloud!
ParlAI - A framework for training and evaluating AI models on a variety of openly available dialogue datasets.
aws-cli - Universal Command Line Interface for Amazon Web Services