I'm amazed how I find anything & why I have so many dupes!
4 projects | /r/DataHoarder | 8 Jul 2023
There's always the well-respected tool, Czkawka. Or, if the CLI is your thing, jdupes is a good option.
Anyone know of any good file deduplication tools?
2 projects | /r/sysadmin | 29 Jun 2023
4 projects | news.ycombinator.com | 13 Jun 2023
My research many years ago turned up jdupes as the right/best solution I could find for my use case.
Though that works fine from a script perspective, I'd like some more interactive way of sorting directories, etc. Identifying is just the first step; jdupes helps with linking the files (both soft and hard links come with caveats, though!), but that is mostly to save space, not to help with reorganisation.
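The hard-link caveat the commenter mentions can be sketched in a few lines of Python (a simplified illustration of what jdupes' hard-link mode does, not its actual code; the file names here are hypothetical):

```python
import os
import tempfile

def hardlink_duplicate(original, duplicate):
    """Replace a redundant copy with a hard link to the original,
    roughly what jdupes' hard-link mode does. Caveat: both paths
    then share one inode, so editing either "file" changes both."""
    os.remove(duplicate)          # drop the redundant copy
    os.link(original, duplicate)  # recreate it as a hard link

# Demo with two identical temp files (hypothetical names).
d = tempfile.mkdtemp()
a, b = os.path.join(d, "a.txt"), os.path.join(d, "b.txt")
for p in (a, b):
    with open(p, "w") as f:
        f.write("same bytes")

hardlink_duplicate(a, b)
# Both paths now point at the same inode on disk.
assert os.path.samestat(os.stat(a), os.stat(b))
```

This saves space but, as noted above, is risky for reorganisation: appending to `a.txt` silently changes `b.txt` too, and hard links only work within a single filesystem.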
Any good duplicate file finder for windows?
3 projects | /r/sysadmin | 22 Apr 2023
jdupes is a tuned fork of the well-known fdupes, and has Win32 releases.
FLaNK Stack Weekly 3 April 2023
39 projects | dev.to | 3 Apr 2023
Backing Up Data: Tips/Advice for Tons of Unorganized Data and Duplicate Files from Multiple Sources
4 projects | /r/DataHoarder | 21 Dec 2022
Anyone running Bees? Or deduping data some other way?
4 projects | /r/synology | 15 Dec 2022
If not bees, do you run other programs for deduping? I see jdupes has support for BTRFS, https://github.com/jbruchon/jdupes, and also duperemove, https://github.com/markfasheh/duperemove.
Ask HN: Tool to find identical file subtrees scattered over disks
3 projects | news.ycombinator.com | 11 Nov 2022
Tools to find duplicate files on multiple file servers
2 projects | /r/sysadmin | 4 Oct 2022
Jdupes. There are precompiled 64-bit and 32-bit Win32 packages, and Linux is supported. It's a fork of fdupes, which was Linux/Unix only.
Recommendation Needed. I just received several terabytes of unstructured data. It appears that the user would create a backup of his files ("Backup 1", "Backup 2", etc.) and then drag entire copies of his files over. There are thousands of duplicate files. Can someone recommend a good file dedupe app?
2 projects | /r/sysadmin | 2 Sep 2022
jdupes is a powerful and fast command-line tool to find and remove duplicate files. Linux and Windows.
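The core strategy behind tools like jdupes and fdupes can be sketched as: bucket files by size, then confirm matches by hashing content. This is a simplified illustration, not jdupes' actual algorithm (which adds partial hashing and a final byte-for-byte comparison for speed and certainty):

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Simplified duplicate finder: group files by size first (cheap),
    then by SHA-256 of their contents (only for sizes seen twice)."""
    by_size = defaultdict(list)
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            by_size[os.path.getsize(path)].append(path)

    groups = []
    for paths in by_size.values():
        if len(paths) < 2:
            continue  # a unique size cannot have a duplicate
        by_hash = defaultdict(list)
        for path in paths:
            with open(path, "rb") as f:
                by_hash[hashlib.sha256(f.read()).hexdigest()].append(path)
        groups.extend(g for g in by_hash.values() if len(g) > 1)
    return groups
```

Each returned group lists paths with identical contents; what to do with them (delete, hard-link, symlink, or block-level dedupe) is where the tools above differ.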
What are some alternatives?
fdupes - FDUPES is a program for identifying or deleting duplicate files residing within specified directories.
dupeguru - Find duplicate files
rmlint - Extremely fast tool to remove duplicates and other lint from your filesystem
rdfind - find duplicate files utility
czkawka - Multi functional app to find duplicates, empty folders, similar images etc.
duperemove - Tools for deduping file systems
fclones - Efficient Duplicate File Finder
phockup - Media sorting tool to organize photos and videos from your camera in folders by year, month and day.
btrfs-progs - Development of userspace BTRFS tools
cdecrypt - Decrypt Wii U NUS content — Forked from: https://code.google.com/archive/p/cdecrypt/
dduper - Fast block-level out-of-band BTRFS deduplication tool.
libpostal - A C library for parsing/normalizing street addresses around the world. Powered by statistical NLP and open geo data.