MultiPar
scorch
MultiPar
-
Red Dead Redemption 2 PEDS_10 and rdr3.resident.rpf corrupted files and mismatch
But depending on how much effort you want to spend fixing it, I might be able to fix your current corrupt files, especially if they are only mildly corrupt. I could create a '.par2' file from mine (assuming you are on build 1436.28); you would then use MultiPar (i.e. https://github.com/Yutaka-Sawada/MultiPar/releases ) to scan your files, which would tell me how many blocks you need, and I could then create recovery data with MultiPar. Doing this might dramatically lower the amount of data I would have to upload, since MultiPar would then fix your current corrupt files so they match mine identically.
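To make that exchange concrete, here is a rough sketch driven from Python via par2cmdline, the command-line counterpart to MultiPar (both read and write standard .par2 files). The file names and the 10% redundancy level are made-up examples, not anything taken from the actual game files.

# Rough sketch of the repair exchange described above, using par2cmdline.
# File names and the 10% redundancy level are hypothetical examples.
import subprocess

files = ["rdr3.rpf", "peds_10.rpf"]  # hypothetical file names

# On the machine with the intact copy: create recovery data.
subprocess.run(["par2", "create", "-r10", "rdr3_set.par2", *files], check=True)

# On the machine with the corrupt copy: verify to see how many blocks
# are damaged, then repair once the .par2 recovery files are available.
subprocess.run(["par2", "verify", "rdr3_set.par2", *files])
subprocess.run(["par2", "repair", "rdr3_set.par2", *files], check=True)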
-
WinRAR zero-day exploited since April to hack trading accounts
MultiPar is the continuation for Windows https://github.com/Yutaka-Sawada/MultiPar
And this for Linux https://github.com/animetosho/par2cmdline-turbo
-
It's been more than 2 years since animetosho pleaded with uploaders to stop RAR'ing their Usenet uploads. What are your thoughts on this in 2023?
* MultiPar supports in-situ repair, which bypasses this problem entirely
-
unRAID: with a parity disk in place, how do you rebuild a specific directory or file?
To detect bitrot I use the dynamix file integrity plugin; it computes a BLAKE3 hash for each file and stores it as an extended attribute. You can choose which shares are covered. For correction I then use par2 files created with MultiPar https://hp.vector.co.jp/authors/VA021385/ https://github.com/Yutaka-Sawada/MultiPar (Windows only; par2cmdline for Linux. You could compile it from source and add it to unRAID if you want, I've done it for testing purposes)
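For illustration, a minimal sketch of the hash-in-an-extended-attribute idea is below. The plugin itself uses BLAKE3; the sketch falls back to hashlib.blake2b because it ships with Python, and the attribute name user.hash.blake2b is just something made up for the example. It works on Linux filesystems that support extended attributes.

# Minimal sketch of storing a file hash in an extended attribute.
# The dynamix plugin uses BLAKE3; blake2b is used here only because it
# is in the Python standard library. The attribute name is made up.
import hashlib, os

ATTR = "user.hash.blake2b"

def file_hash(path):
    h = hashlib.blake2b()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def stamp(path):
    # Record the current hash as an extended attribute (Linux only).
    os.setxattr(path, ATTR, file_hash(path).encode())

def verify(path):
    # False means the file no longer matches its recorded hash.
    return os.getxattr(path, ATTR).decode() == file_hash(path)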
- How does MultiPar work?
-
How to ensure file integrity?
And last but not least - make regular parity files of your healthy, important data, and store them somewhere. MultiPar is your best friend. Even if your data gets corrupted, with parity files you can restore it to its former glory. Some limitations apply, read the manual.
-
Is there a way to quickly create PAR2 files from the Windows context menu?
This is possible with MultiPar https://github.com/Yutaka-Sawada/MultiPar
-
I need to store a 1MB file for 20 years, would making thousands of copies on a 4.7GB DVD be enough?
More sanely, perhaps, get some M-Disc DVDs (if you can find real ones - apparently they're becoming just normal DVDs) and put one file on each, and then run https://github.com/Yutaka-Sawada/MultiPar on it to create parity files. Then store the file, the parity files, and the software you need (MultiPar) on the disc. To be safe, you probably also need to store the DVD burner you used to make it, as those are getting thin on the ground these days.
-
Best archival medium for 3-10 gig video files?
Note about par2: if you are using a Windows system with an Nvidia GPU, this version of par2 is faster as it speaks CUDA.
-
Usenet vs Torrents for more results
For any file repairs use MultiPar; it's still being updated and has GPU acceleration. https://github.com/Yutaka-Sawada/MultiPar/releases
scorch
-
How do I ensure that I do not get a time-delayed ransomware attack?
The method I use is to run scorch every night to compute hashes for new files and to check around 12% of the old files for hash errors. Even if your backup is made the same day as a ransomware attack, you will still catch it if the attack hits enough files for one of them to get randomly scrubbed. Also, scorch is designed around keeping the hash database small and independent from the rest of the system, so you can automate copying it to a bunch of different places.
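In case the rotation idea isn't obvious, here is a toy illustration of "hash new files, then re-check a random ~12% of the known ones each night". This is not scorch's code or database format, just the general shape of the routine; the path and percentage are placeholders.

# Toy version of the nightly routine: add hashes for new files, then
# re-verify a random ~12% of the files already in the database.
# This is NOT scorch itself; scorch has its own database and options.
import hashlib, json, os, random

DB_PATH = "hashes.json"   # kept small and easy to copy elsewhere
CHECK_FRACTION = 0.12

def sha256(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def nightly(root):
    db = json.load(open(DB_PATH)) if os.path.exists(DB_PATH) else {}
    for dirpath, _, names in os.walk(root):          # hash new files
        for name in names:
            path = os.path.join(dirpath, name)
            if path not in db:
                db[path] = sha256(path)
    known = [p for p in db if os.path.exists(p)]     # re-check a slice
    for path in random.sample(known, int(len(known) * CHECK_FRACTION)):
        if sha256(path) != db[path]:
            print("HASH MISMATCH:", path)
    with open(DB_PATH, "w") as f:
        json.dump(db, f, indent=1)

nightly("/data")   # placeholder path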
- Does this not exist? Checksum program...
-
ZFS or BTRFS for raid0 + backup server
Lastly, you could just point scorch (https://github.com/trapexit/scorch) at your drives and run it on a cron or systemd timer - just have the script alert you with an e-mail or whatever your preferred method is. Not ideal but probably less work than rebuilding two arrays because you don't like the format of error messages.
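A sketch of that "run it on a timer and alert me" wrapper might look like the following; the scorch arguments and SMTP details are placeholders for whatever your setup actually uses, and you would point cron or a systemd timer at this script.

# Sketch of the alerting wrapper: run the integrity check and e-mail
# yourself if it exits non-zero. Command arguments and SMTP settings
# are placeholders.
import smtplib, subprocess
from email.message import EmailMessage

result = subprocess.run(["scorch", "check", "/mnt/disk1"],   # placeholder
                        capture_output=True, text=True)

if result.returncode != 0:
    msg = EmailMessage()
    msg["Subject"] = "integrity check reported errors"
    msg["From"] = "nas@example.com"
    msg["To"] = "you@example.com"
    msg.set_content(result.stdout + result.stderr)
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)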
-
Embarking on my hoarding journey
If you really care, you can use something like scorch or file-digests to get the hashes of your files and just store that in a text file, recalculating monthly. No need to get fancy with it. Hell, write your own simple script that hashes, outputs to file, and checks previous versions.
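That "simple script" really can be a few lines; a minimal sketch is below (SHA-256 and the output file name are arbitrary choices). Run it monthly, and any line that disappears from the previous listing is a file that changed or went missing.

# Minimal "hash everything to a text file, diff against last run" script.
# Hash choice and file names are arbitrary.
import hashlib, os, sys

def hash_lines(root):
    for dirpath, _, names in os.walk(root):
        for name in sorted(names):
            path = os.path.join(dirpath, name)
            digest = hashlib.sha256(open(path, "rb").read()).hexdigest()
            yield digest + "  " + path

current = set(hash_lines(sys.argv[1]))

# Report anything that was in last run's listing but is gone or changed now.
if os.path.exists("hashes.prev.txt"):
    previous = set(open("hashes.prev.txt").read().splitlines())
    for line in sorted(previous - current):
        print("changed or missing:", line)

with open("hashes.prev.txt", "w") as f:
    f.write("\n".join(sorted(current)) + "\n")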
-
Tool to add checksums to files on EXT4 and verify them.
Not exactly what you're looking for but close -> https://github.com/trapexit/scorch
-
Tool to compare file set against a list of hashes and import new/unique files
Scorch should fit the bill (https://github.com/trapexit/scorch)
- Generate hash for all files in all folders and subfolders on HDD
- Manual File Indexing
- Manual file indexing on my NAS
What are some alternatives?
dvdisaster - A tool providing additional ECC protection for optical media (unofficial version)
cshatag - Detect silent data corruption under Linux using sha256 stored in extended attributes
snapraid - A backup program for disk arrays. It stores parity information of your data and it recovers from up to six disk failures
file-digests - 📐 A tool to check if there are any changes in your files by storing and later checking their digests/hashes (BLAKE2b512, SHA3-256, or SHA512-256).
ParParGUI - GUI front-end to ParPar, a PAR2 creation tool
znapzend - zfs backup with remote capabilities and mbuffer integration.
Nyuu - Flexible usenet binary posting tool
CalCorrupt - File corrupter using PyQt5
par2cmdline - Official repo for par2cmdline and libpar2
HashCheck - HashCheck Shell Extension for Windows with added SHA2, SHA3, and multithreading; originally from code.kliu.org
HBBatchBeast - A free GUI application for HandBrake and FFmpeg/FFprobe with an emphasis on batch conversion (including recursive folder scans and folder watching) - Windows, macOS, Linux & Docker
honst - Fixes your dataset according to your rules.