bees vs jdupes

| | bees | jdupes |
|---|---|---|
| Mentions | 21 | 44 |
| Stars | 589 | 1,681 |
| Growth | - | - |
| Activity | 4.0 | 0.0 |
| Latest commit | 15 days ago | 7 months ago |
| Language | C++ | C |
| License | GNU General Public License v3.0 only | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
bees
-
Converted ext4 to btrfs, tried defrag and ran out of space
Btrfs defrag 'will break up the reflinks of COW data' and 'may cause considerable increase of space usage depending on the broken up reflinks'. To fix this, I would run bees to deduplicate the now-duplicated data. It may be worth doing this from e.g. a live disk, though, as out-of-space errors can cause things to break (so don't upgrade packages until you fix this).
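A minimal sketch of what that recovery might look like, assuming bees is installed and run through its beesd wrapper (the mount point and UUID below are placeholders, not values from the post):

```shell
# Placeholder sketch: restore shared extents after defrag broke reflinks.
# /mnt/data and <UUID> are stand-ins for your own filesystem.

# 1. Find the UUID of the btrfs filesystem:
sudo btrfs filesystem show /mnt/data

# 2. Run the beesd wrapper against that UUID; bees scans the
#    filesystem and re-shares identical extents over time:
sudo beesd <UUID>
```

bees works incrementally, so space comes back gradually rather than in one pass.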
-
Introducing Pins: Permanent Nix Binary Storage
Figuring out which paths are needed outside gcroots'ed closures is pretty complicated. If you're using flakes, the main issue is duplicates, so store optimization and bees may help. With channels, once you update a channel you might as well gc everything else.
-
rule
bees
- Should you remove duplicate files?
-
Poke holes in my git-annex + ZFS offline storage system
I felt more confident with the code/developer/docs. The author knows his stuff regarding btrfs. Like, look at this, it's amazing: https://github.com/Zygo/bees/blob/master/docs/btrfs-kernel.md
-
Anyone running Bees? Or deduping data some other way?
I have some time again and wondering if anyone's got Bees, https://github.com/Zygo/bees, running on their Synology.
-
The goal: Use Fedora 37 with Snapper to get a "riceable" Linux desktop that can be rolled back like a time machine (and some comments on why I don't use Silverblue)
Even if NixOS doesn't support sending deduplicating syscalls to the kernel, you could use the Btrfs deduping daemon called bees to slowly save space over time. There might be an equivalent for ZFS, too.
-
Questions Regarding BTRFS, Suspend, and Data Integrity
This isn't much different from ext4: zero-length files can happen after a crash. You can avoid this in the future by mounting with flushoncommit. See here for details.
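As a config sketch, a hedged example of what that mount option looks like (UUID and mount point are placeholders):

```shell
# /etc/fstab sketch — flushoncommit forces data to disk before the
# matching metadata commit, avoiding zero-length files after a crash.
# UUID and mount point are placeholders.
UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx  /data  btrfs  defaults,flushoncommit  0 0
```

The same option can be applied to a live system with `mount -o remount,flushoncommit /data`.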
-
Compression
Maybe bees can help you dedupe at the block level, not just whole files.
- Is bees an after-the-fact solution to btrfs defragmentation breaking reflinks?
jdupes
-
File Servers... how are you handling duplicates
I recommend the use of jdupes, a fork of the well-known fdupes, to find duplicate files.
-
fdupes: Identify or Delete Duplicate Files
200 lines of Nim [1] seems to run about 9X faster than the 8000 lines of C in fdupes on a little test dir I have. If you need C, I think jdupes [2] is faster as @TacticalCoder points out a couple of times here. In my testing, `dups` is usually faster than `jdupes`, though.
[1] https://github.com/c-blake/bu/blob/main/dups.nim
[2] https://github.com/jbruchon/jdupes
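A hedged sketch of basic jdupes usage for context (the directory path is a placeholder; flags per the jdupes documentation):

```shell
# Recursively list groups of duplicate files:
jdupes -r /path/to/dir

# Print a summary of how much space the duplicates occupy
# instead of listing every group:
jdupes -r -m /path/to/dir
```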
-
I'm amazed how I find anything & why I have so many dupes!
There's always the well-respected tool Czkawka. Or, if the CLI is your thing, jdupes is a good option.
- Anyone know of any good file deduplication tools?
-
Johnny Decimal
My research into this many years ago turned up jdupes as the right / best solution I could find for my use case.
https://github.com/jbruchon/jdupes
Though that works fine from a script perspective, I'd like some more interactive way of sorting directories, etc. Identifying is just the first step; jdupes helps with linking the files (both soft and hard links come with caveats, though!), but that is mostly to save space, not to help with reorganisation.
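A short sketch of the linking mode mentioned above, with the main caveat as a comment (the path is a placeholder):

```shell
# Replace duplicates with hard links. Caveat: hard-linked files share
# one inode, so editing any "copy" changes all of them, and hard links
# cannot cross filesystem boundaries.
jdupes -r -L /path/to/dir
```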
- Jdupes: A powerful duplicate file finder
-
Does jdupes do a 'dry run' if you just specify directory(s) and no other options?
I can work it out by looking at https://github.com/jbruchon/jdupes.
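For what it's worth, a hedged answer in command form (the directory is a placeholder): with no action flags, jdupes only reports matches and modifies nothing.

```shell
jdupes -r /some/dir       # just lists duplicate groups — effectively a dry run
jdupes -r -d /some/dir    # interactive delete prompt — NOT a dry run
```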
-
replace duplicates with hard links - I think jdupes is the answer, or maybe fclones (I have questions)
I have looked at a few alternatives and think jdupes is the one for me. I then found out it is not multi-threaded; I will give it a go anyway, but the developer of jdupes recommended fclones (https://github.com/jbruchon/jdupes/issues/186) for large file systems where multi-threading matters. As I am using a hard disk, it may not be necessary.
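A sketch of the fclones workflow the developer pointed to, assuming the subcommand names from the fclones documentation (paths are placeholders):

```shell
# fclones separates finding duplicates from acting on them:
fclones group /path/to/dir > dupes.txt   # multi-threaded duplicate scan
fclones link < dupes.txt                 # replace duplicates with hard links
```

On a spinning disk the multi-threading may matter less, as the post notes, since the drive itself is the bottleneck.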
-
De-Duping a file server
jdupes is a fork of the old standby fdupes, but it has a Win32 release as well as supporting POSIX.
-
Any good duplicate file finder for windows?
jdupes is a tuned fork of the well-known fdupes, and has Win32 releases.
What are some alternatives?
dduper - Fast block-level out-of-band BTRFS deduplication tool.
fdupes - FDUPES is a program for identifying or deleting duplicate files residing within specified directories.
duperemove - Tools for deduping file systems
dupeguru - Find duplicate files
btrbk - Tool for creating snapshots and remote backups of btrfs subvolumes
rmlint - Extremely fast tool to remove duplicates and other lint from your filesystem
yarn-deduplicate - Deduplication tool for yarn.lock files
rdfind - find duplicate files utility
snap-sync - Use snapper snapshots to backup to external drive
czkawka - Multi functional app to find duplicates, empty folders, similar images etc.
dedupe - A Python library for accurate and scalable fuzzy matching, record deduplication and entity resolution.