| | rmlint | kindfs |
|---|---|---|
| Mentions | 16 | 2 |
| Stars | 1,778 | 3 |
| Growth | - | - |
| Activity | 5.8 | 4.1 |
| Latest Commit | 5 months ago | 7 months ago |
| Language | C | Python |
| License | GNU General Public License v3.0 only | GNU General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
rmlint
-
fdupes: Identify or Delete Duplicate Files
My preferred solution is rmlint [https://github.com/sahib/rmlint], mostly because it also looks at duplicate directories. It doesn't delete anything itself; instead, it produces a bash script that you can examine before running.
-
ZFS 2.2.0 (RC): Block Cloning merged
After I removed duplicates (with help of https://github.com/sahib/rmlint ), I migrated my photos to an ordinary zpool instead.
-
I decluttered 14,000 digital items within a few hours. Here's how I did it.
For the technically savvy among you there is an excellent open source program called ‘rmlint’ (aka. Remove Lint). It is excellent at finding duplicates and saved me terabytes of space.
-
Looking for Powerful Deduplication software
You don’t say if you are on Windows or Unix. I have used rmlint successfully in the past.
-
the very best anti-duplicate app ?
dupeguru or rmlint
-
deleting duplicates programs?
rmlint, my friend, is the last tool you will ever need for this
- script to remove redundant parent directories
- Is there software that scans for duplicates?
- data hoarding software
-
Go Find Duplicates: blazingly-fast simple-to-use tool to find duplicate files
I use and test assorted duplicate finders regularly.
fdupes is the classic (going way, way back), but it's really very slow; not worth using anymore.
The four I know to be worth trying these days (depending on data set, hardware, file arrangement and other factors, any one of them might be fastest for a specific use case) are https://github.com/jbruchon/jdupes , https://github.com/pauldreik/rdfind , https://github.com/jvirkki/dupd and https://github.com/sahib/rmlint
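All of these finders share the same core optimization: never hash a file whose size is unique. A minimal sketch of that size-then-hash pipeline in Python (illustrative only, not taken from any of the tools listed; real tools add partial hashing, hardlink detection and parallelism on top):

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Return groups of files under `root` with identical content."""
    # Pass 1: bucket files by size -- cheap stat() calls, no I/O on contents.
    by_size = defaultdict(list)
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            if os.path.isfile(path) and not os.path.islink(path):
                by_size[os.path.getsize(path)].append(path)

    # Pass 2: hash only files whose size collides with another file's.
    duplicates = []
    for paths in by_size.values():
        if len(paths) < 2:
            continue  # a unique size can never be a duplicate: skip hashing
        by_hash = defaultdict(list)
        for path in paths:
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            by_hash[h.hexdigest()].append(path)
        duplicates.extend(group for group in by_hash.values() if len(group) > 1)
    return duplicates
```

The size pre-filter is why performance varies so much across data sets: a tree full of same-size files forces hashing nearly everything, while a tree of varied sizes needs almost no reads at all.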
Had not encountered fclones before, will give it a try.
kindfs
-
fdupes: Identify or Delete Duplicate Files
fdupes is really nice and fast, but (as far as I remember) it lacked two features I needed for my use case: (1) listing duplicate directories (without listing all of their duplicate sub-contents), and (2) identifying that all the contents of one directory are included in another part of the FS (regardless of file/directory structure), which is particularly useful when you have a bigmess/ directory that you progressively sort out into a clean/ directory. Put differently: fdupes helps regain space, but it couldn't help me much with cleaning up a messy drive...
This is why I wrote https://github.com/karteum/kindfs (which indexes the filesystem into an SQLite DB and then enables various ways to process it).
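The indexing idea described here can be sketched roughly as follows (an illustrative guess at the approach, not kindfs's actual schema or code; the table layout and function names are made up). Each directory gets a digest derived from its children's digests, so exactly identical directories share a digest wherever they sit in the tree:

```python
import hashlib
import os
import sqlite3

def index_tree(root, db_path=":memory:"):
    """Index a directory tree into SQLite: one row per file (path, size,
    sha256) and one row per directory with a content-derived digest.
    Illustrative sketch only -- not kindfs's actual schema."""
    db = sqlite3.connect(db_path)
    db.execute("CREATE TABLE files (path TEXT, size INTEGER, hash TEXT)")
    db.execute("CREATE TABLE dirs (path TEXT, hash TEXT)")

    def file_hash(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def walk(dirpath):
        # A directory's digest hashes its children's (name, digest) pairs in
        # sorted order, so identical contents always yield identical digests.
        entries = []
        for name in sorted(os.listdir(dirpath)):
            full = os.path.join(dirpath, name)
            if os.path.isdir(full):
                entries.append((name, walk(full)))
            elif os.path.isfile(full):
                h = file_hash(full)
                db.execute("INSERT INTO files VALUES (?, ?, ?)",
                           (full, os.path.getsize(full), h))
                entries.append((name, h))
        digest = hashlib.sha256(repr(entries).encode()).hexdigest()
        db.execute("INSERT INTO dirs VALUES (?, ?)", (dirpath, digest))
        return digest

    walk(root)
    db.commit()
    return db
```

Duplicate directories then fall out of a single query, e.g. `SELECT hash, COUNT(*) FROM dirs GROUP BY hash HAVING COUNT(*) > 1`. Note this digest scheme only catches exactly identical directories; the "contents included elsewhere" check described in the quote above would instead compare sets of file hashes, ignoring names and structure.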
-
Go Find Duplicates: blazingly-fast simple-to-use tool to find duplicate files
FWIW if people are interested, I wrote https://github.com/karteum/kindfs for the purpose of indexing the hard drive, with the following goals
What are some alternatives?
jdupes - A powerful duplicate file finder and an enhanced fork of 'fdupes'.
rdfind - a utility to find duplicate files
fdupes - FDUPES is a program for identifying or deleting duplicate files residing within specified directories.
dude - Duplicates Detector is a cross-platform GUI utility for finding duplicate files, allowing you to delete or link them to save space. Duplicate files are displayed and processed on two synchronized panels for efficient and convenient operation.
fclones - Efficient Duplicate File Finder
dupeguru - Find duplicate files
czkawka - Multi functional app to find duplicates, empty folders, similar images etc.
mpifileutils - File utilities designed for scalability and performance.