| | quickhash | hashit |
|---|---|---|
| Mentions | 6 | 6 |
| Stars | 352 | 45 |
| Growth | - | - |
| Activity | 6.7 | 2.7 |
| Last commit | 6 months ago | 3 months ago |
| Language | Pascal | Go |
| License | GNU General Public License v3.0 only | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
quickhash
-
Why are there no checksum apps for MacOS?
Tried Quick Hash GUI and the command-line md5, but the OS is so locked down it won't let me (even as superuser in Terminal).
First I tried https://www.quickhash-gui.org/; it would not run (no helpful error, it just said it would not run) and the popup had no buttons. I then did some googling and set the following
-
What's the best value method to store 60TB of data and protect against data decay (bit rot)?
But you could try something like https://www.quickhash-gui.org
-
Why does an original quality photo lose quality when downloaded from Google Photos?
Quick Hash on Windows/Mac or Solid Explorer on Android (under the file properties section) can compare the checksums of two files to see if they are the same.
- An open source file Hasher AND Verifier?
- Trying to find program to check multiple hashes.
hashit
-
Best way to verify data - mass file checksum compare
Alternatively, instead of using hashdeep, use "hashit": https://github.com/boyter/hashit
-
Criticism please: Is there a better way to log checksums of all my files?
For Windows I typically use hashdeep, although I did come across hashit on GitHub, which is quite a bit faster: https://github.com/boyter/hashit
-
An open source file Hasher AND Verifier?
I did find this alternative: https://github.com/boyter/hashit/releases/tag/v1.1.0
-
How would you organise about 10 old hard drives?
I personally use hashdeep now with sha256 (well, recently discovered hashit - https://github.com/boyter/hashit and export to hashdeep format, and wrote my own script to compare log files for duplicates and potential issues). But crccheckcopy is a quick and simple way to verify your data and locate duplicates.
-
Drive Integrity Software
Hashit: https://github.com/boyter/hashit
-
create a hash for files inside a folders
hashit (Linux, Windows; it's Go code, so compile as you wish) - https://github.com/boyter/hashit
What are some alternatives?
gtkhash - A cross-platform desktop utility for computing message digests or checksums
blake3 - An AVX-512 accelerated implementation of the BLAKE3 cryptographic hash function
collisions - Hash collisions and exploitations
xsum - Checksums with Merkle trees and concurrency
cshatag - Detect silent data corruption under Linux using sha256 stored in extended attributes
FileVerification - Generates a hash of all files in a folder tree and stores the hashes in a text file in each folder.
Checksums - macOS workflow and shell script to calculate or automatically verify checksums for files or folder contents
fhash - an open source file hash calculator for Windows and macOS
go-benchmarks - Comprehensive and reproducible benchmarks for Go developers and architects.
czkawka - Multi functional app to find duplicates, empty folders, similar images etc.