mpifileutils
File utilities designed for scalability and performance. (by hpc)
duphard
A simple utility to detect duplicate files and replace them with hard links. (by andmarios)
| | mpifileutils | duphard |
|---|---|---|
| Mentions | 4 | 1 |
| Stars | 160 | 2 |
| Growth | 0.6% | - |
| Activity | 5.1 | 0.0 |
| Latest commit | 21 days ago | almost 4 years ago |
| Language | C | Go |
| License | BSD 3-clause "New" or "Revised" License | GNU General Public License v3.0 only |
The number of mentions indicates the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
mpifileutils
Posts with mentions or reviews of mpifileutils.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2022-10-17.
- Pigz: A parallel implementation of gzip for multi-core machines
  If you ever run into the limitations of a single machine, dbz2 is also a fun little tool for this sort of thing. You can run it across multiple machines and it will automatically balance the workload across them.
  https://github.com/hpc/mpifileutils/blob/master/man/dbz2.1
- MpiFileUtils: File utilities designed for scalability and performance
- Go Find Duplicates: blazingly-fast simple-to-use tool to find duplicate files
  If you want something that scales horizontally, dcmp from https://github.com/hpc/mpifileutils is an option.
- You can list a directory containing 8M files, but not with ls
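The mpifileutils tools mentioned above (dcmp, dbz2, and friends) are MPI programs, so they are launched through an MPI launcher rather than run directly. A minimal sketch of comparing two directory trees with dcmp, assuming mpifileutils and an MPI launcher such as mpirun are installed (the rank count of 64 and the paths are just examples):

```shell
# Compare two directory trees in parallel across 64 MPI ranks.
# dcmp walks both trees and reports files that exist in only one
# tree or whose contents differ.
mpirun -np 64 dcmp /scratch/tree_a /scratch/tree_b
```

Adding more ranks (possibly spread over several nodes via a hostfile or a job scheduler) is how these tools scale horizontally.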
duphard
Posts with mentions or reviews of duphard.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2021-08-29.
- Go Find Duplicates: blazingly-fast simple-to-use tool to find duplicate files
  For example, I maintain a tar file and a Docker image with Kafka connectors that share many jar files; using duphard I can save hundreds of megabytes, sometimes more than a gigabyte. For a documentation website with many copies of the same image (some static site generators favor this practice for maintaining multiple versions), I can reduce the site size by more than 60%, which makes ssh copies, Docker pulls, and similar transfers much faster and speeds up deployments.
  https://github.com/andmarios/duphard
What are some alternatives?
When comparing mpifileutils and duphard you can also consider the following projects:
fclones - Efficient Duplicate File Finder
rdfind - find duplicate files utility
rmlint - Extremely fast tool to remove duplicates and other lint from your filesystem
pigz - A parallel implementation of gzip for modern multi-processor, multi-core machines.
go-find-duplicates - Find duplicate files (photos, videos, music, documents) on your computer, portable hard drives etc.
coreutils - Enhancements to the GNU coreutils (especially head)
jdupes - A powerful duplicate file finder and an enhanced fork of 'fdupes'.
czkawka - Multi-functional app to find duplicates, empty folders, similar images etc.
dupd - CLI utility to find duplicate files