bees vs czkawka

| | bees | czkawka |
|---|---|---|
| Mentions | 21 | 361 |
| Stars | 589 | 17,501 |
| Growth | - | - |
| Activity | 4.0 | 7.7 |
| Latest commit | 15 days ago | about 1 month ago |
| Language | C++ | Rust |
| License | GNU General Public License v3.0 only | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
bees
-
Converted ext4 to btrfs, tried defrag and ran out of space
Btrfs defrag 'will break up the reflinks of COW data' and 'may cause considerable increase of space usage depending on the broken up reflinks'. To try to fix this, I would run bees to deduplicate the now-duplicate extents. It may be worth doing this from e.g. a live disk, though, as out-of-space errors can cause things to break (so don't upgrade packages until you fix this).
-
Introducing Pins: Permanent Nix Binary Storage
Figuring out which paths are needed outside gcroots'ed closures is pretty complicated. If you're using flakes, the main issue is duplicates, so store optimization and bees may help. With channels, once you update a channel you might as well gc everything else.
-
rule
bees
- Should you remove duplicate files?
-
Poke holes in my git-annex + ZFS offline storage system
I felt more confident with the code/developer/docs. The author knows his stuff regarding btrfs. Like, look at this, it's amazing: https://github.com/Zygo/bees/blob/master/docs/btrfs-kernel.md
-
Anyone running Bees? Or deduping data some other way?
I have some time again and wondering if anyone's got Bees, https://github.com/Zygo/bees, running on their Synology.
-
The goal: Use Fedora 37 with Snapper to get a "riceable" Linux desktop that can be rolled back like a time machine (and some comments on why I don't use Silverblue)
Even if NixOS doesn't support sending deduplicating syscalls to the kernel, you could use the Btrfs deduping daemon called bees to slowly save space over time. There might be an equivalent for ZFS, too.
-
Questions Regarding BTRFS, Suspend, and Data Integrity
This isn't much different than ext4. 0 length files can happen after a crash. You can avoid this by mounting with flushoncommit for the future. See here for details.
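The flushoncommit option mentioned above is a real btrfs mount option set in /etc/fstab; a minimal sketch, where the device and mount point are placeholders, not values from the post:

```
# /etc/fstab — flushoncommit flushes data on every transaction commit,
# avoiding zero-length files after a crash (at some performance cost)
/dev/sda2  /mnt/data  btrfs  defaults,flushoncommit  0  0
```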
-
Compression
Maybe bees can help you dedup at the block level, rather than the file level.
- Is bees an after-the-fact solution to BTRFS defragmentation breaking reflinks?
czkawka
- Is there software to compress large but similar files?
- Merge three separate partial libraries from external USB drives
-
Tools to deduplicate files
https://github.com/qarmin/czkawka — by far the best of anything I've tried
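Exact-duplicate finders like czkawka, fdupes, and jdupes generally group files by size first (cheap) and only hash the candidates that share a size. A minimal sketch of that idea in Python — `find_duplicates` is illustrative, not any tool's actual code:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root: str) -> list[list[Path]]:
    """Group identical files: first by size (cheap), then by SHA-256 (exact)."""
    by_size: dict[int, list[Path]] = defaultdict(list)
    for p in Path(root).rglob("*"):
        if p.is_file():
            by_size[p.stat().st_size].append(p)

    groups = []
    for paths in by_size.values():
        if len(paths) < 2:
            continue  # a unique size cannot have a duplicate
        by_hash: dict[str, list[Path]] = defaultdict(list)
        for p in paths:
            digest = hashlib.sha256(p.read_bytes()).hexdigest()
            by_hash[digest].append(p)
        groups.extend(g for g in by_hash.values() if len(g) > 1)
    return groups
```

Real tools typically add a cheap prefix-hash pass between the size check and the full hash so large unique files are rejected without being read in full.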
-
fdupes: Identify or Delete Duplicate Files
I've used Czkawka (https://github.com/qarmin/czkawka) because it does Lanczos-based image duplicate detection, which makes it more practical for me.
-
AllDup suddenly taking forever to process/delete selections
Maybe it's a setting you made or the files themselves; not sure. You can try another tool, czkawka, to see if you get better results with it.
-
Is there a file duplicate finder that works with animated jpegxl-gif?
For static images I used https://github.com/qarmin/czkawka and it works well enough. I think. But when I used it on a folder with GIFs and their JXL conversions, it shows nothing. SURELY this could not be user error, rrrright?
-
PhotoPrism: Browse Your Life in Pictures
I used to use DupeGuru which has some photo-specific dupe detection where you can fuzzy match image dupes based on content: https://dupeguru.voltaicideas.net/
But I switched over to czkawka, which has a better interface for comparing files, and seems to be a bit faster: https://github.com/qarmin/czkawka
Unfortunately, neither of these are integrated into Photoprism, so you still have to do some file management outside the database before importing.
I also haven't used Photoprism extensively yet (I think it's running on one of my boxes, but I haven't gotten around to setting it up), but I did find that it wasn't really built for file-based libraries. It's a little more heavyweight, but my research shows that Nextcloud Memories might be a better choice for me (it's not the first-party Nextcloud photos app, but another one put together by the community): https://apps.nextcloud.com/apps/memories
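The content-based fuzzy matching described above usually relies on a perceptual hash. A toy sketch of one common technique, average hashing, on plain grayscale pixel lists — this illustrates the general idea only, not the algorithm DupeGuru or czkawka actually ships:

```python
def average_hash(pixels: list[list[int]], hash_size: int = 8) -> int:
    """Perceptual 'average hash' of a grayscale image (2D list of 0-255 values).

    The image is shrunk to hash_size x hash_size by block averaging, then
    each cell becomes one bit: 1 if brighter than the overall mean, else 0.
    """
    h, w = len(pixels), len(pixels[0])
    cells = []
    for by in range(hash_size):
        for bx in range(hash_size):
            # average the pixel block that maps onto this grid cell
            y0, y1 = by * h // hash_size, (by + 1) * h // hash_size
            x0, x1 = bx * w // hash_size, (bx + 1) * w // hash_size
            block = [pixels[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    bits = 0
    for c in cells:
        bits = (bits << 1) | (1 if c > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits; a small distance means visually similar."""
    return bin(a ^ b).count("1")
```

Two images whose hashes differ in only a few bits are likely near-duplicates; tools in this space typically expose that distance as a user-tunable similarity threshold.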
-
Please don't post like 20 similar images to the art sites?
Czkawka can do this.
-
I'm amazed how I find anything & why I have so many dupes!
There's always the well-respected tool, Czkawka. Or, if the CLI is your thing, jdupes is a good option.
- I saw a post regarding crate to delete similar files
What are some alternatives?
dduper - Fast block-level out-of-band BTRFS deduplication tool.
dupeguru - Find duplicate files
duperemove - Tools for deduping file systems
jdupes - A powerful duplicate file finder and an enhanced fork of 'fdupes'.
btrbk - Tool for creating snapshots and remote backups of btrfs subvolumes
fdupes - FDUPES is a program for identifying or deleting duplicate files residing within specified directories.
yarn-deduplicate - Deduplication tool for yarn.lock files
AntiDupl - A program to search similar and defect pictures on the disk
PhotoPrism - AI-Powered Photos App for the Decentralized Web 🌈💎✨
snap-sync - Use snapper snapshots to backup to external drive
darktable - darktable is an open source photography workflow application and raw developer