ccheck vs zfs-to-glacier

| | ccheck | zfs-to-glacier |
|---|---|---|
| Stars | 5 | 2 |
| | 26 | 38 |
| Growth | - | - |
| Activity | 0.0 | 1.9 |
| Last Commit | about 3 years ago | 11 months ago |
| Language | Perl | Rust |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
ccheck
-
Anyone know what causes intermittent corruption of random visual media files across drives and machines?
Grab a friend's computer and amass a large batch of known-good files, making sure they cover many different file formats. You can likely find entire archives of test data in different formats online; to really reproduce this, I'd assume the set should be multiple GB in size and contain JPEGs, videos, text files, PDFs, etc.

Now write a script, or use a tool like ccheck (https://github.com/jwr/ccheck), to compute the SHA-256 checksum of every file in this test package and write the results to a file. Copy the package to as many storage media as you have access to: CDs/DVDs are great, plus a thumb drive, your laptop; a NAS with ZFS (and ECC RAM) would be amazing; probably throw it up on cloud storage just to be safe.

Then run the same script as a cron job, maybe on your main machine, to continuously check that the checksums still match their original values. As soon as you notice a checksum mismatch, isolate that file, locate the same file across all the other systems, and do a deeper inspection: open it up in a hex editor and do a bit-by-bit comparison to see where the corruption occurred and how bad it is. This will start to give you a better picture of what may be going on.
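The build-then-verify checksum workflow described above can be sketched in a few lines. This is a minimal illustration under my own assumptions (a `digest  relative-path` manifest format, no handling of filenames containing double spaces, missing files raise an error), not ccheck itself:

```python
import hashlib
import os


def sha256_of(path, bufsize=1 << 20):
    """Stream a file through SHA-256 so large files don't fill RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(bufsize):
            h.update(chunk)
    return h.hexdigest()


def build_manifest(root, manifest_path):
    """Record 'digest  relative-path' for every file under root."""
    with open(manifest_path, "w") as out:
        for dirpath, _, files in os.walk(root):
            for name in sorted(files):
                p = os.path.join(dirpath, name)
                out.write(f"{sha256_of(p)}  {os.path.relpath(p, root)}\n")


def verify_manifest(root, manifest_path):
    """Re-hash every recorded file; return the paths that no longer match."""
    mismatches = []
    with open(manifest_path) as f:
        for line in f:
            digest, rel = line.rstrip("\n").split("  ", 1)
            if sha256_of(os.path.join(root, rel)) != digest:
                mismatches.append(rel)
    return mismatches
```

Build the manifest once against the known-good copy (and store the manifest somewhere outside the tree being checked), then run `verify_manifest` from cron against each replica and alert on any non-empty mismatch list.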
-
Show HN: Off-site, encrypted backups for $1/TB/month at 99.999999999% durability
Here's my "me too" — I've been happily using rclone for things like photo archives (together with my small consistency checker to check file hashes for corruption https://github.com/jwr/ccheck). I also use Arq Backup with B2 as the destination. This gives me very reasonable storage costs and backups I can access and test regularly.
-
What Happened to Perl 7?
Perl is very well suited for certain tasks (not large software systems, but programs that process data). It is also one of very few languages/ecosystems where you can expect your code to work after >10 years. This is why I sometimes use it, for example my fs consistency checker (https://github.com/jwr/ccheck) was written in Perl specifically because it's a long-term tool and I would like to be able to run it on any system in 15 years.
Compare this long-term approach with the constant fires in the Python or (heaven forbid) Node ecosystems, where things break all the time.
-
I Nearly Lost the Lightroom Catalog with All My Photos
This sort of thing scares me. It's why I started running consistency checks on my important archives (like my photo library), which I keep backed up in multiple places. We tend to think that in a digital world bits are just bits and do not get corrupted — which is decidedly untrue.
I wrote my own consistency checker, as I wasn't happy with what was out there. I wanted it to be simple, and maintainable in the long term (>10 years horizon). See https://github.com/jwr/ccheck if you need something like this. I now update my checksums regularly and check for corruption.
-
How do I safely store my files?
Good point about bitrot. This is why I wrote ccheck.pl (https://github.com/jwr/ccheck) — I wanted to be able to check and detect bitrot in a way that depends on as little technology as possible.
zfs-to-glacier
-
Show HN: Off-site, encrypted backups for $1/TB/month at 99.999999999% durability
https://github.com/andaag/zfs-to-glacier
I built something similar a while back that I've been using for years now.
Something worth noting: Glacier has a minimum cost per object. If you have tons of tiny KB-sized files (incremental snapshots...), it's drastically cheaper to fall back to plain S3 for them.
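The small-file penalty exists because the S3 Glacier storage classes bill per-object metadata overhead (roughly 40 KB per archived object: ~32 KB at Glacier rates plus ~8 KB at S3 Standard rates) and impose a minimum storage duration (180 days for Deep Archive), so a backup tool can route objects by size. Here is a minimal sketch of that routing decision; the 128 KB threshold is an illustrative assumption, not a computed break-even:

```python
# Per-object overhead for S3 Glacier storage classes (per AWS S3
# pricing docs): ~32 KB billed at Glacier rates plus ~8 KB billed at
# S3 Standard rates, and a minimum storage duration (180 days for
# Deep Archive). Tiny objects therefore cost proportionally far more
# in Glacier than their payload size suggests.
GLACIER_OVERHEAD_BYTES = 40 * 1024


def choose_storage_class(size_bytes: int, threshold: int = 128 * 1024) -> str:
    """Pick an S3 storage class for a backup object by size.

    The threshold is an illustrative assumption; the real break-even
    depends on current per-GB prices and how long the object lives.
    """
    return "DEEP_ARCHIVE" if size_bytes >= threshold else "STANDARD"
```

A 4 KB incremental snapshot would go to `STANDARD`, while a multi-megabyte full snapshot would go to `DEEP_ARCHIVE`.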
-
Best way to backup a ZFS pool or its data sets to the cloud
ZFS to Glacier
What are some alternatives?
glacier_deep_archive_backup - Extremely low cost, off-site backup/restore using AWS S3 Glacier Deep Archive
voidvault - Bootstrap Void with FDE
zfs-to-aws
darktable - darktable is an open source photography workflow application and raw developer
rclone - "rsync for cloud storage" - Google Drive, S3, Dropbox, Backblaze B2, One Drive, Swift, Hubic, Wasabi, Google Cloud Storage, Azure Blob, Azure Files, Yandex Files
App-perlbrew - Manage perl installations in your $HOME
arq_restore - command-line utility for restoring from Arq backups
berrybrew - Perlbrew for Windows!
roast - 🦋 Raku test suite
zfs-on-mac - My personal ZFS on macOS instructions and scripts
plenv - Perl binary manager