fdupes vs rmlint
| | fdupes | rmlint |
|---|---|---|
| Mentions | 17 | 16 |
| Stars | 2,354 | 1,768 |
| Growth | - | - |
| Activity | 2.5 | 5.8 |
| Latest Commit | about 2 months ago | 4 months ago |
| Language | C | C |
| License | - | GNU General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
fdupes
- Fdupes: Identify or Delete Duplicate Files
- fdupes: Identify or Delete Duplicate Files
- Removing image duplicates
fdupes is simple and easy to use: https://github.com/adrianlopezroche/fdupes
- Backing Up Data: Tips/Advice for Tons of Unorganized Data and Duplicate Files from Multiple Sources
- File Deduplication
I recently used [fdupes](https://github.com/adrianlopezroche/fdupes) to find duplicate files from my Amazon Cloud Drive / Photos migration. It took about 2 days to scour through roughly 1.5TB worth of data.
- How would I go about copying around 5TB worth of data, from multiple drives to a singular drive/drives (Shared Pools/Raid)?
I would add the content of your current drives to the new big drive with rsync. I would then run https://github.com/adrianlopezroche/fdupes to remove duplicate files.
- Ask HN: Tool to find identical file subtrees scattered over disks
- Which tool do you use to find duplicate files?
jdupes, an optimized fork of the popular fdupes. There are 32-bit and 64-bit Windows packages of jdupes on GitHub.
- Mercredi Tech - 2022-05-11
- Suggestions on how to identify & report on old stale data in file shares?
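Several of the fdupes mentions above boil down to the same workflow: consolidate everything onto one big drive with rsync, then let fdupes report (and optionally delete) the duplicates. Below is a minimal sketch of that workflow; the mount points are made-up placeholders, and you should review fdupes' output (and its man page) before deleting anything.

```bash
# Consolidate data from the old drives onto the new one (archive mode keeps
# permissions and timestamps; the trailing slash copies the directory contents).
rsync -av --progress /mnt/old_drive1/ /mnt/big_drive/
rsync -av --progress /mnt/old_drive2/ /mnt/big_drive/

# Recursively list duplicate sets, with file sizes, without deleting anything yet.
fdupes -rS /mnt/big_drive/

# Or just print a summary of how much space the duplicates are wasting.
fdupes -rm /mnt/big_drive/

# Once satisfied, delete interactively: -d prompts for which copy to keep in each set.
fdupes -rd /mnt/big_drive/
```

Adding -N to -d keeps the first file in each duplicate set and deletes the rest without prompting, which is convenient for large runs but unforgiving.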
rmlint
- fdupes: Identify or Delete Duplicate Files
My preferred solution is rmlint (https://github.com/sahib/rmlint), mostly because it also looks at duplicate directories. It doesn't delete anything itself; instead it produces a bash script that you can examine before running it.
- ZFS 2.2.0 (RC): Block Cloning merged
After I removed duplicates (with the help of https://github.com/sahib/rmlint), I migrated my photos to an ordinary zpool instead.
- I decluttered 14,000 digital items within a few hours. Here's how I did it.
For the technically savvy among you, there is an excellent open-source program called ‘rmlint’ (a.k.a. Remove Lint). It is very good at finding duplicates and saved me terabytes of space.
- Looking for Powerful Deduplication software
You don’t say if you are on Windows or Unix. I have used rmlint successfully in the past.
- the very best anti-duplicate app?
dupeguru or rmlint
- deleting duplicates programs?
rmlint, my friend, is the last tool you will ever need for this.
- script to remove redundant parent directories
- Is there software that scans for duplicates?
- data hoarding software
- Go Find Duplicates: blazingly-fast simple-to-use tool to find duplicate files
I use and test assorted duplicate finders regularly.
fdupes is the classic (going way, way back), but it's really very slow and not worth using anymore.
The four I know of that are worth trying these days (depending on the data set, hardware, file arrangement, and other factors, any one of them might be fastest for a specific use case) are https://github.com/jbruchon/jdupes, https://github.com/pauldreik/rdfind, https://github.com/jvirkki/dupd, and https://github.com/sahib/rmlint.
Had not encountered fclones before; will give it a try.
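The rmlint comments above highlight its safety model: by default it deletes nothing and instead writes out a shell script for you to review. A rough sketch of that workflow follows, with placeholder paths; the -D / --merge-directories option is, to my understanding, the flag behind the "duplicate directories" feature mentioned above, but check rmlint --help on your version before relying on it.

```bash
# Scan for duplicates (and other "lint"); nothing is deleted at this stage.
# rmlint writes its findings to rmlint.sh and rmlint.json in the current directory.
rmlint /mnt/photos /mnt/backup

# Optionally also report entire directories whose contents are duplicated
# (assumed to be the -D / --merge-directories option; verify with rmlint --help).
rmlint -D /mnt/photos /mnt/backup

# Review the generated script before doing anything destructive...
less rmlint.sh

# ...and only then run it to actually remove the duplicates.
sh rmlint.sh
```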
What are some alternatives?
rdfind - find duplicate files utility
jdupes - A powerful duplicate file finder and an enhanced fork of 'fdupes'.
czkawka - Multifunctional app to find duplicates, empty folders, similar images, etc.
fclones - Efficient Duplicate File Finder
go-find-duplicates - Find duplicate files (photos, videos, music, documents) on your computer, portable hard drives etc.
dupeguru - Find duplicate files
rsync - An open source utility that provides fast incremental file transfer. It also has useful features for backup and restore operations among many other use cases.