kindfs vs fdupes

| | kindfs | fdupes |
|---|---|---|
| Mentions | 2 | 17 |
| Stars | 3 | 2,370 |
| Growth | - | - |
| Activity | 4.1 | 2.3 |
| Latest Commit | 7 months ago | 13 days ago |
| Language | Python | C |
| License | GNU General Public License v3.0 only | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
kindfs
- fdupes: Identify or Delete Duplicate Files
fdupes is really nice and fast, but (as far as I remember) it was lacking two features that I needed for my use case: (1) listing duplicate directories (without listing all of their duplicate sub-contents), and (2) identifying that all the contents of one directory are already included somewhere else in the filesystem (regardless of file/directory structure), which is particularly useful when you have a bigmess/ directory that you progressively sort out into a clean/ directory. Put differently: fdupes helps to regain space, but it could not help me much with cleaning up a messy drive...
This is why I wrote https://github.com/karteum/kindfs, which indexes the filesystem into an SQLite database and then enables various ways to process it.
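To make that concrete, here is a minimal sketch of such an index, not kindfs's actual code or schema (the table name, function names, and scanned path below are invented for illustration): hash every file, derive each directory's signature from its sorted child hashes, store both in SQLite, and duplicate directories then fall out of a single GROUP BY.

```python
import hashlib
import os
import sqlite3

def file_sha256(path, bufsize=1 << 20):
    """Content hash of a single file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(bufsize):
            h.update(chunk)
    return h.hexdigest()

def index_tree(root, db_path="index.db"):
    """Index every file and directory under root into SQLite."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS entries"
                " (path TEXT PRIMARY KEY, kind TEXT, hash TEXT)")
    dir_hash = {}
    # Bottom-up walk, so child hashes exist before their parent needs them.
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        child_hashes = []
        for name in filenames:
            p = os.path.join(dirpath, name)
            try:
                h = file_sha256(p)
            except OSError:
                continue  # unreadable file: left out of this sketch
            child_hashes.append(h)
            con.execute("INSERT OR REPLACE INTO entries VALUES (?, ?, ?)",
                        (p, "file", h))
        for name in dirnames:
            sub = os.path.join(dirpath, name)
            if sub in dir_hash:  # skips symlinked dirs os.walk doesn't enter
                child_hashes.append(dir_hash[sub])
        # Sorting makes the signature independent of names and ordering,
        # so two directories with identical contents compare equal.
        sig = hashlib.sha256("".join(sorted(child_hashes)).encode()).hexdigest()
        dir_hash[dirpath] = sig
        con.execute("INSERT OR REPLACE INTO entries VALUES (?, ?, ?)",
                    (dirpath, "dir", sig))
    con.commit()
    return con

# Duplicate directories are simply 'dir' rows that share a hash.
con = index_tree("/path/to/scan")
query = ("SELECT group_concat(path, '  |  ') FROM entries"
         " WHERE kind = 'dir' GROUP BY hash HAVING count(*) > 1")
for (paths,) in con.execute(query):
    print(paths)
```

The second use case from the comment, checking whether everything under bigmess/ already exists under clean/, then becomes a containment query over the same table: every file hash under one root must also appear under the other.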
- Go Find Duplicates: blazingly-fast simple-to-use tool to find duplicate files
FWIW if people are interested, I wrote https://github.com/karteum/kindfs for the purpose of indexing the hard drive, with the following goals
fdupes
- Fdupes: Identify or Delete Duplicate Files
- fdupes: Identify or Delete Duplicate Files
- Removing image duplicates
fdupes is simple and easy to use: https://github.com/adrianlopezroche/fdupes
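For scripting on top of it, fdupes's default output is plain text: duplicate files one per line, with blank lines separating the groups. A small sketch of driving it from Python under that assumption (the scanned path is a placeholder):

```python
import subprocess

# Recursively scan a directory (placeholder path) for duplicate files.
result = subprocess.run(
    ["fdupes", "-r", "/path/to/photos"],
    capture_output=True, text=True, check=True,
)

# Default output: one path per line, blank lines between duplicate groups.
groups = [g.splitlines() for g in result.stdout.strip().split("\n\n") if g]
for group in groups:
    print(f"{len(group)} identical copies:")
    for path in group:
        print("  " + path)
```

fdupes also offers -S to show sizes and -d for interactive deletion; the parsing above relies only on the default output format.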
- Backing Up Data: Tips/Advice for Tons of Unorganized Data and Duplicate Files from Multiple Sources
- File Deduplication
I recently used [fdupes](https://github.com/adrianlopezroche/fdupes) to figure out duplicate files from my Amazon Cloud Drive / Photos migration. Took about 2 days to scour through about 1.5 TB worth of data.
- How would I go about copying around 5TB worth of data, from multiple drives to a singular drive/drives (Shared Pools/Raid)?
I would add the contents of your current drives to the new big drive with rsync. I would then run https://github.com/adrianlopezroche/fdupes to remove duplicate files.
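A sketch of that workflow as a script, with made-up mount points (rsync's -a preserves permissions and timestamps; fdupes -r scans recursively):

```python
import subprocess

# Hypothetical mount points for the old drives and the new big drive.
sources = ["/mnt/old1", "/mnt/old2"]
dest = "/mnt/bigdrive/"

# No trailing slash on the sources, so rsync creates one
# subdirectory per old drive under the destination.
for src in sources:
    subprocess.run(["rsync", "-a", "--progress", src, dest], check=True)

# List duplicates across the merged tree; review before rerunning
# with -d (interactive delete) or -dN (keep first in each set, no prompt).
subprocess.run(["fdupes", "-r", dest], check=True)
```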
- Ask HN: Tool to find identical file subtrees scattered over disks
- Which tool do you use to find duplicate files?
jdupes, an optimized fork of the popular fdupes. There are 32-bit and 64-bit Win32 packages of jdupes on GitHub.
- Mercredi Tech - 2022-05-11
- Suggestions on how to identify & report on old stale data in file shares?
What are some alternatives?
rmlint - Extremely fast tool to remove duplicates and other lint from your filesystem
rdfind - find duplicate files utility
jdupes - A powerful duplicate file finder and an enhanced fork of 'fdupes'.
dude - Duplicates Detector is a cross-platform GUI utility for finding duplicate files, allowing you to delete or link them to save space. Duplicate files are displayed and processed on two synchronized panels for efficient and convenient operation.
czkawka - Multi functional app to find duplicates, empty folders, similar images etc.
go-find-duplicates - Find duplicate files (photos, videos, music, documents) on your computer, portable hard drives etc.
mpifileutils - File utilities designed for scalability and performance.
dupeguru - Find duplicate files