| | ccheck | zfs-on-mac |
|---|---|---|
| Mentions | 5 | 1 |
| Stars | 26 | 78 |
| Activity | - | - |
| Growth | 0.0 | 0.0 |
| Latest commit | about 3 years ago | over 1 year ago |
| Language | Perl | Shell |
| License | MIT License | - |
Stars: the number of stars a project has on GitHub. Growth: month-over-month growth in stars.
Activity: a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
ccheck
-
Anyone know what causes intermittent corruption of random visual media files across drives and machines?
Grab a friend's computer and amass a large batch of known-good files, making sure they span many different file formats. You can probably find entire archives of test data in different formats online; to really reproduce this, I'd assume the set should be multiple GB in size and contain JPEGs, videos, text files, PDFs, etc. Now write a script, or use a tool like this (https://github.com/jwr/ccheck), to compute the SHA-256 checksum of every file in the test package and write it to a file. Then copy the package to as many media as you have access to: CDs/DVDs are great, plus a thumb drive, your laptop; a NAS with ZFS (and ECC RAM) would be amazing, and probably throw it up on cloud storage just to be safe. I would then run the same script as a cron job, maybe on your main machine, to continuously check that the checksums match their original values. As soon as you notice a checksum mismatch, isolate that file, locate the same file across all the other systems, and do a deeper inspection: open it up in a hex editor and do a bit-by-bit comparison to see where the corruption occurred and how bad it is. This will start to give you a better picture of what may be going on.
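The checksum-then-verify loop described above can be sketched with plain coreutils; this is a minimal stand-in for a tool like ccheck, and the demo directory and file names are purely illustrative:

```shell
# Sketch of the workflow: record a SHA-256 manifest of a known-good file set,
# then re-verify it later to detect silent corruption.
# The demo directory stands in for your real archive.
ARCHIVE=$(mktemp -d)
echo "known-good data" > "$ARCHIVE/photo1.jpg"
echo "more data"       > "$ARCHIVE/doc.pdf"
cd "$ARCHIVE"

# 1. Record a checksum for every file (run once on the known-good copy).
find . -type f ! -name MANIFEST -exec sha256sum {} + > MANIFEST

# 2. Later (e.g. from cron), verify every file against the manifest.
#    A non-zero exit status means at least one checksum no longer matches.
sha256sum --quiet -c MANIFEST && echo "all files intact"

# 3. Simulate bitrot on one copy and confirm it is detected.
printf 'x' >> photo1.jpg
sha256sum --quiet -c MANIFEST || echo "corruption detected"
```

With `--quiet`, only mismatching files are printed, which makes the output convenient to act on (or to have cron mail to you).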
-
Show HN: Off-site, encrypted backups for $1/TB/month at 99.999999999% durability
Here's my "me too" — I've been happily using rclone for things like photo archives (together with my small consistency checker to check file hashes for corruption https://github.com/jwr/ccheck). I also use Arq Backup with B2 as the destination. This gives me very reasonable storage costs and backups I can access and test regularly.
-
What Happened to Perl 7?
Perl is very well suited for certain tasks (not large software systems, but programs that process data). It is also one of the very few languages/ecosystems where you can expect your code to work after more than 10 years. This is why I sometimes use it: for example, my fs consistency checker (https://github.com/jwr/ccheck) was written in Perl specifically because it's a long-term tool and I would like to be able to run it on any system in 15 years.
Compare this long-term approach with the fires in Python or (heaven forbid) Node ecosystems, where things break all the time.
-
I Nearly Lost the Lightroom Catalog with All My Photos
This sort of thing scares me. It's why I started running consistency checks on my important archives (like my photo library), which I keep backed up in multiple places. We tend to think that in a digital world bits are just bits and do not get corrupted — which is decidedly untrue.
I wrote my own consistency checker, as I wasn't happy with what was out there. I wanted it to be simple, and maintainable in the long term (>10 years horizon). See https://github.com/jwr/ccheck if you need something like this. I now update my checksums regularly and check for corruption.
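The "update regularly and check" part can be automated with cron. A hypothetical crontab entry (the mount point and manifest name are placeholders; this uses plain sha256sum rather than guessing at ccheck's actual invocation):

```shell
# Hypothetical crontab entry: verify the photo archive every Sunday at 03:00.
# cron mails any non-empty job output, so a checksum mismatch reaches you.
# /Volumes/Photos and MANIFEST are illustrative names.
0 3 * * 0  cd /Volumes/Photos && sha256sum --quiet -c MANIFEST
```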
-
How do I safely store my files?
Good point about bitrot. This is why I wrote ccheck.pl (https://github.com/jwr/ccheck) — I wanted to be able to check and detect bitrot in a way that depends on as little technology as possible.
zfs-on-mac
-
I Nearly Lost the Lightroom Catalog with All My Photos
2. When the card is full, images are copied in the field onto the USB drive. I imagine the user has a 128G SD card, and multiple photo sessions will exhaust internal storage quickly, especially on a Mac with a 256G-512G SSD, hence an external drive. APFS may be used instead of FAT, but then the drive won't be readable on Windows or Linux. A Mac doesn't allow writes to an NTFS filesystem unless you buy a third-party driver of unknown reliability. I guess one could try to combine your linked guide with https://github.com/spl/zfs-on-mac to get ZFS on a USB drive, but all of that is done at your own risk.
What are some alternatives?
glacier_deep_archive_backup - Extremely low cost, off-site backup/restore using AWS S3 Glacier Deep Archive
darktable - darktable is an open source photography workflow application and raw developer
voidvault - Bootstrap Void with FDE
App-perlbrew - Manage perl installations in your $HOME
berrybrew - Perlbrew for Windows!
roast - 🦋 Raku test suite
plenv - Perl binary manager
deduposaur
zfs-to-aws
Inline-Perl5 - Use Perl 5 code in a Raku program
zfs-to-glacier - A tool to sync zfs snapshots to s3-glacier, written in rust.