| | rmlint | zfs |
|---|---|---|
| Mentions | 16 | 3 |
| Stars | 1,778 | 0 |
| Growth | - | - |
| Activity | 5.8 | 7.6 |
| Latest Commit | 5 months ago | 7 days ago |
| Language | C | C |
| License | GNU General Public License v3.0 only | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
rmlint
-
fdupes: Identify or Delete Duplicate Files
My preferred solution is rmlint [https://github.com/sahib/rmlint], mostly because it also looks at duplicate directories. It produces a bash script instead of deleting anything itself, so you can examine the script before running it.
-
ZFS 2.2.0 (RC): Block Cloning merged
After I removed duplicates (with the help of https://github.com/sahib/rmlint), I migrated my photos to an ordinary zpool instead.
-
I decluttered 14,000 digital items within a few hours. Here's how I did it.
For the technically savvy among you, there is an excellent open source program called ‘rmlint’ (a.k.a. Remove Lint). It is very good at finding duplicates and saved me terabytes of space.
-
Looking for Powerful Deduplication software
You don’t say if you are on Windows or Unix. I have used rmlint successfully in the past.
-
the very best anti-duplicate app ?
dupeguru or rmlint
-
deleting duplicates programs?
rmlint, my friend, is the last tool you will ever need for this
- script to remove redundant parent directories
- Is there software that scans for duplicates?
- data hoarding software
-
Go Find Duplicates: blazingly-fast simple-to-use tool to find duplicate files
I use and test assorted duplicate finders regularly.
fdupes is the classic (going way, way back), but it's really very slow and not worth using anymore.
The four I know of that are worth trying these days are https://github.com/jbruchon/jdupes , https://github.com/pauldreik/rdfind , https://github.com/jvirkki/dupd and https://github.com/sahib/rmlint (depending on the data set, hardware, file arrangement and other factors, any one of these might be fastest for a specific use case).
Had not encountered fclones before, will give it a try.
zfs
-
ZFS 2.2.0 (RC): Block Cloning merged
Not in production, but using ZoL on my personal workstations. https://zfsonlinux.org/
Some discussion: https://www.reddit.com/r/NixOS/comments/ops0n0/big_shoutout_...
-
zfs special device
You could clone this branch, build it (you don't need to install it) and then do something like ./zdb -L -Q foo/bar or zdb -L -Q foo/ if you want the whole pool. (Obviously, if you're walking all the metadata, even with a special vdev, that can take a bit.)
-
That's a lot of metadata...
That seems reasonable to me. Maybe a different ordering would be better, though, and I need to make sure I'm not missing any subtleties...
What are some alternatives?
jdupes - A powerful duplicate file finder and an enhanced fork of 'fdupes'.
qatlib
rdfind - find duplicate files utility
reflink-snapshot - CLI tool for Managing Reflink based Snapshots
fdupes - FDUPES is a program for identifying or deleting duplicate files residing within specified directories.
duperemove - Tools for deduping file systems
fclones - Efficient Duplicate File Finder
zfs - OpenZFS on Linux and FreeBSD
dupeguru - Find duplicate files
zfs-localpv - Dynamically provision Stateful Persistent Node-Local Volumes & Filesystems for Kubernetes, integrated with a backend ZFS data storage stack.
czkawka - Multi-functional app to find duplicates, empty folders, similar images, etc.
zfs-autosnapshot - Automatically snapshot your zfs filesystem, and remove (garbage collect) stale snapshots after a while