dupd vs rdfind

| | dupd | rdfind |
|---|---|---|
| Mentions | 1 | 16 |
| Stars | 109 | 875 |
| Growth | - | - |
| Activity | 0.0 | 4.1 |
| Last commit | 11 months ago | 27 days ago |
| Language | C | C++ |
| License | GNU General Public License v3.0 only | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
dupd
- Go Find Duplicates: blazingly-fast, simple-to-use tool to find duplicate files
I use and test assorted duplicate finders regularly.
fdupes is the classic (going way, way back), but it's really very slow and not worth using anymore.
The four I know of that are worth trying these days (depending on the data set, hardware, file arrangement, and other factors, any one of these might be fastest for a specific use case) are https://github.com/jbruchon/jdupes, https://github.com/pauldreik/rdfind, https://github.com/jvirkki/dupd, and https://github.com/sahib/rmlint
Had not encountered fclones before, will give it a try.
rdfind
- Rdfind: A utility to find duplicate files, delete them or replace with hardlinks
- Self hosted, web gui, file duplication scanner
I use rdfind for this.
- Is there a Mac app that will allow me to recursively go through thousands of folders, calculate the total folder size, then compare against all other folder sizes, and if the size is identical, delete the newer one?
rdfind is available for macOS; I've been using it on linux: https://github.com/pauldreik/rdfind
- Deduplication on EXT4
You can use rdfind to find all duplicates in your experiments dir and replace them with hardlinks. This way each file's content occupies disk space only once, with every hardlink referencing the same inode and disk location.
- How do I show non-duplicate files across 2 drives?
- Pip and cargo are not the same
I use rdfind to deal with this: https://github.com/pauldreik/rdfind
- Backing Up Data: Tips/Advice for Tons of Unorganized Data and Duplicate Files from Multiple Sources
- This has probably happened to all of us at least once
Yeah, I periodically download the full drives and just deduplicate with rdfind hardlinking identical files.
- AMD/Xilinx Vivado rant
- recommends for de-duplication?
I use rdfind on my Linux NAS. https://github.com/pauldreik/rdfind
What are some alternatives?
fclones - Efficient Duplicate File Finder
fdupes - FDUPES is a program for identifying or deleting duplicate files residing within specified directories.
jdupes - A powerful duplicate file finder and an enhanced fork of 'fdupes'.
go-find-duplicates - Find duplicate files (photos, videos, music, documents) on your computer, portable hard drives etc.
rmlint - Extremely fast tool to remove duplicates and other lint from your filesystem
dupeguru - Find duplicate files
mpifileutils - File utilities designed for scalability and performance.
kindfs - Index filesystem into a database, then easily make queries e.g. to find duplicates files/dirs, or mount the index with FUSE.