| | s3sha256sum | hashit |
|---|---|---|
| Mentions | 4 | 6 |
| Stars | 10 | 53 |
| Growth | - | - |
| Activity | 5.3 | 2.7 |
| Last Commit | 5 months ago | 5 months ago |
| Language | Go | Go |
| License | GNU General Public License v3.0 only | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
s3sha256sum
-
Updating your programs for S3 Express One Zone
s3sha256sum is a program that calculates SHA-256 checksums of S3 objects. I wrote this program before AWS launched their own feature to support checksums (which you should definitely be using, as it makes your uploads faster!).
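For reference, the checksum feature mentioned here is requested at upload time. A minimal sketch with the AWS SDK for Go v2 (the bucket, key, and file name are placeholders, and this is not s3sha256sum's own code):

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
	"github.com/aws/aws-sdk-go-v2/service/s3/types"
)

func main() {
	cfg, err := config.LoadDefaultConfig(context.TODO())
	if err != nil {
		log.Fatal(err)
	}
	client := s3.NewFromConfig(cfg)

	f, err := os.Open("file.bin") // placeholder local file
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Ask S3 to compute and store a SHA-256 checksum during the upload,
	// so later verification needs no separate hashing pass.
	out, err := client.PutObject(context.TODO(), &s3.PutObjectInput{
		Bucket:            aws.String("my-bucket"), // placeholder
		Key:               aws.String("file.bin"),
		Body:              f,
		ChecksumAlgorithm: types.ChecksumAlgorithmSha256,
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("stored checksum:", aws.ToString(out.ChecksumSHA256))
}
```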
-
Introducing s3verify: verify that a local file is identical to an S3 object without having to download the object data
I hope that s3verify will be useful to you. Please file an issue in the GitHub repository if you have any problems using it. It is a perfect companion to my earlier S3 programs, shrimp and s3sha256sum.
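The post doesn't include s3verify's internals, but the no-download idea can be sketched: hash the local file, then compare against the checksum S3 recorded at upload time, fetched with a metadata-only HeadObject call. This sketch assumes the object was uploaded in a single part with a SHA-256 checksum (multipart uploads store a composite checksum instead); the bucket and key are placeholders:

```go
package main

import (
	"context"
	"crypto/sha256"
	"encoding/base64"
	"fmt"
	"io"
	"log"
	"os"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
	"github.com/aws/aws-sdk-go-v2/service/s3/types"
)

func main() {
	cfg, err := config.LoadDefaultConfig(context.TODO())
	if err != nil {
		log.Fatal(err)
	}
	client := s3.NewFromConfig(cfg)

	// Fetch the stored checksum; only metadata crosses the wire.
	head, err := client.HeadObject(context.TODO(), &s3.HeadObjectInput{
		Bucket:       aws.String("my-bucket"), // placeholder
		Key:          aws.String("file.bin"),
		ChecksumMode: types.ChecksumModeEnabled,
	})
	if err != nil {
		log.Fatal(err)
	}

	// Hash the local copy.
	f, err := os.Open("file.bin")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()
	h := sha256.New()
	if _, err := io.Copy(h, f); err != nil {
		log.Fatal(err)
	}

	// S3 returns checksums base64-encoded.
	local := base64.StdEncoding.EncodeToString(h.Sum(nil))
	if local == aws.ToString(head.ChecksumSHA256) {
		fmt.Println("OK: local file matches the S3 object")
	} else {
		fmt.Println("MISMATCH")
	}
}
```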
-
Stable versions of shrimp and s3sha256sum
It has been almost a year since I introduced shrimp and s3sha256sum, and I think it is time to announce the changes that have been made since then.
-
Introducing shrimp and s3sha256sum
The program is called s3sha256sum, and the name should be familiar to many of you. As the name implies, it calculates SHA-256 checksums of objects on Amazon S3. It uses a normal GetObject request and streams the object contents to the SHA-256 hashing function. This way there is no need to download the entire object to your hard drive. You can verify very large objects without worrying about running out of local storage.
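A minimal sketch of that streaming approach with the AWS SDK for Go v2 (an illustration, not s3sha256sum's actual source; the bucket and key are placeholders):

```go
package main

import (
	"context"
	"crypto/sha256"
	"fmt"
	"io"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

func main() {
	cfg, err := config.LoadDefaultConfig(context.TODO())
	if err != nil {
		log.Fatal(err)
	}
	client := s3.NewFromConfig(cfg)

	out, err := client.GetObject(context.TODO(), &s3.GetObjectInput{
		Bucket: aws.String("my-bucket"), // placeholder
		Key:    aws.String("my-object"),
	})
	if err != nil {
		log.Fatal(err)
	}
	defer out.Body.Close()

	// Stream the object body straight into the hasher; nothing is
	// written to disk, so object size doesn't matter locally.
	h := sha256.New()
	if _, err := io.Copy(h, out.Body); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%x\n", h.Sum(nil))
}
```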
hashit
-
Best way to verify data - mass file checksum compare
Alternatively, instead of using hashdeep, use "hashit": https://github.com/boyter/hashit
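For a sense of what these tools do under the hood, here is a minimal Go sketch of recursive hashing: walk a directory tree and print one sha256sum-style line per file. hashit itself does considerably more (multiple algorithms, concurrency, hashdeep-format output); this only illustrates the core loop:

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"io"
	"io/fs"
	"log"
	"os"
	"path/filepath"
)

func main() {
	root := "."
	if len(os.Args) > 1 {
		root = os.Args[1]
	}
	// Visit every regular file under root and emit "checksum  path".
	err := filepath.WalkDir(root, func(path string, d fs.DirEntry, err error) error {
		if err != nil || d.IsDir() {
			return err
		}
		f, err := os.Open(path)
		if err != nil {
			return err
		}
		defer f.Close()
		h := sha256.New()
		if _, err := io.Copy(h, f); err != nil {
			return err
		}
		fmt.Printf("%x  %s\n", h.Sum(nil), path)
		return nil
	})
	if err != nil {
		log.Fatal(err)
	}
}
```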
-
Criticism please: Is there a better way to log checksums of all my files?
For Windows I typically use hashdeep, although I did come across hashit on GitHub, which is quite a bit faster: https://github.com/boyter/hashit
-
An open source file Hasher AND Verifier?
I did find this alternative: https://github.com/boyter/hashit/releases/tag/v1.1.0
-
How would you organise about 10 old hard drives?
I personally use hashdeep now with SHA-256 (well, I recently discovered hashit - https://github.com/boyter/hashit - and export to hashdeep format, and I wrote my own script to compare the log files for duplicates and potential issues). But crccheckcopy is a quick and simple way to verify your data and locate duplicates.
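The commenter's comparison script isn't shown, but the idea is easy to sketch: read one or more checksum logs (assuming the two-space "hash  path" format that sha256sum and the walk sketch above emit) and report any hash that appears more than once, i.e. duplicate content:

```go
package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"strings"
)

func main() {
	// Map each checksum to every path it was recorded at.
	seen := map[string][]string{}
	for _, name := range os.Args[1:] {
		f, err := os.Open(name)
		if err != nil {
			log.Fatal(err)
		}
		sc := bufio.NewScanner(f)
		for sc.Scan() {
			fields := strings.SplitN(sc.Text(), "  ", 2)
			if len(fields) == 2 {
				seen[fields[0]] = append(seen[fields[0]], fields[1])
			}
		}
		if err := sc.Err(); err != nil {
			log.Fatal(err)
		}
		f.Close()
	}
	// Any checksum with more than one path is duplicated content.
	for hash, paths := range seen {
		if len(paths) > 1 {
			fmt.Println(hash, "->", strings.Join(paths, ", "))
		}
	}
}
```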
-
Drive Integrity Software
Hashit: https://github.com/boyter/hashit
-
create a hash for files inside a folders
hashit (Linux, Windows; it's Go code, so compile it as you wish) - https://github.com/boyter/hashit
What are some alternatives?
super-dollop - Encrypt your files or notes with your GPG key and save them to MinIO or AWS S3 easily!
quickhash - Graphical cross platform data hashing tool for Linux, Windows and Mac
shrimp - Simple program that reliably uploads large files to Amazon S3.
blake3 - An AVX-512 accelerated implementation of the BLAKE3 cryptographic hash function
go-web-dynamo-starter - Aims to be a starting point for dynamodb based serverless fun
xsum - Checksums with Merkle trees and concurrency
s3verify - Verify that a local file is identical to an object on Amazon S3, without having to download the object.
FileVerification - Generates a hash of all files in a folder tree and stores the hashes in a text file in each folder.
collisions - Hash collisions and exploitations
go-benchmarks - Comprehensive and reproducible benchmarks for Go developers and architects.
czkawka - Multi-functional app to find duplicates, empty folders, similar images, etc.