c-blosc
jdupes
| | c-blosc | jdupes |
|---|---|---|
| Mentions | 1 | 44 |
| Stars | 959 | 1,681 |
| Growth | 0.6% | - |
| Activity | 5.7 | 0.0 |
| Last commit | about 2 months ago | 7 months ago |
| Language | C | C |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
c-blosc
-
WASM compression benchmarks and the cost of missing compression APIs
Related to compressing data before storing on SSD:
Blosc - faster than memcpy()
https://github.com/Blosc/c-blosc
Under the right circumstances Blosc is so fast that it even speeds up reading data from RAM (read less, then decompress in the L1 and L2 caches).
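Part of how Blosc achieves this is its shuffle filter: before compressing, it regroups the bytes of an array so that byte 0 of every element comes first, then byte 1, and so on, which lines up long runs of near-identical bytes for the compressor. A minimal pure-Python sketch of that idea (not Blosc's actual SIMD implementation, and `shuffle`/`unshuffle` are names chosen here for illustration):

```python
import struct
import zlib

def shuffle(buf, typesize):
    """Blosc-style byte shuffle: group byte i of every element together."""
    return bytes(buf[j] for i in range(typesize)
                        for j in range(i, len(buf), typesize))

def unshuffle(buf, typesize):
    """Inverse of shuffle(): scatter the byte groups back into elements."""
    n = len(buf) // typesize
    out = bytearray(len(buf))
    k = 0
    for i in range(typesize):
        for e in range(n):
            out[e * typesize + i] = buf[k]
            k += 1
    return bytes(out)

# Slowly varying 32-bit ints: the high bytes are nearly constant, so the
# shuffled layout is much more repetitive than the interleaved original.
data = struct.pack('<1000I', *range(1000))
assert unshuffle(shuffle(data, 4), 4) == data
print(len(zlib.compress(data)), len(zlib.compress(shuffle(data, 4))))
```

Blosc additionally splits data into blocks sized to fit the CPU caches, which is where the "decompress in L1 and L2" effect comes from.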
jdupes
-
File Servers... how are you handling duplicates
I recommend the use of jdupes, a fork of the well-known fdupes, to find duplicate files.
-
fdupes: Identify or Delete Duplicate Files
200 lines of Nim [1] seems to run about 9X faster than the 8000 lines of C in fdupes on a little test dir I have. If you need C, I think jdupes [2] is faster as @TacticalCoder points out a couple of times here. In my testing, `dups` is usually faster than `jdupes`, though.
[1] https://github.com/c-blake/bu/blob/main/dups.nim
[2] https://github.com/jbruchon/jdupes
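The speed differences discussed above mostly come down to how aggressively a tool avoids reading file contents. The common strategy, which jdupes and fdupes both follow, is to group files by size first and only hash the files whose sizes collide. A simplified Python sketch of that approach (`find_dupes` is a name chosen here; jdupes also adds a partial-hash pass and a final byte-for-byte compare, omitted for brevity):

```python
import hashlib
import os
import tempfile
from collections import defaultdict

def find_dupes(paths):
    # Pass 1: group by size. Most files differ in size, so this
    # eliminates the bulk of candidates without reading any data.
    by_size = defaultdict(list)
    for p in paths:
        by_size[os.path.getsize(p)].append(p)
    # Pass 2: hash only same-size candidates.
    by_hash = defaultdict(list)
    for group in by_size.values():
        if len(group) < 2:
            continue
        for p in group:
            with open(p, 'rb') as f:
                by_hash[hashlib.sha256(f.read()).hexdigest()].append(p)
    return [g for g in by_hash.values() if len(g) > 1]

# Demo on a throwaway directory: two identical files, one different.
d = tempfile.mkdtemp()
for name, text in [('a.txt', 'spam'), ('b.txt', 'spam'), ('c.txt', 'eggs')]:
    with open(os.path.join(d, name), 'w') as f:
        f.write(text)
groups = find_dupes(os.path.join(d, n) for n in ('a.txt', 'b.txt', 'c.txt'))
```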
-
I'm amazed how I find anything & why I have so many dupes!
There's always the well-respected tool, Czkawka. Or, if the CLI is your thing, jdupes is a good option.
- Anyone know of any good file deduplication tools?
-
Johnny Decimal
My research into this many years ago turned up jdupes as the right/best solution I could find for my use case.
https://github.com/jbruchon/jdupes
Though that works fine from a script perspective, I'd like some more interactive way of sorting directories, etc. Identifying is just the first step; jdupes helps with linking the files (both soft and hard links come with caveats though!), but that is mostly to save space, not to help with reorganisation.
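The hard-link caveat mentioned above is easy to demonstrate: once two paths share an inode, there is only one set of file contents, so an in-place edit through either name changes "both" files. A small sketch (illustrative only, using standard `os.link`):

```python
import os
import tempfile

d = tempfile.mkdtemp()
a = os.path.join(d, 'a.txt')
b = os.path.join(d, 'b.txt')
with open(a, 'w') as f:
    f.write('original')
os.link(a, b)  # b is now another name for a's inode, not a copy

# Caveat 1: an in-place edit through one name shows up under the other.
with open(b, 'r+') as f:
    f.write('CHANGED!')
print(open(a).read())  # a's contents changed too

# Caveat 2 (not shown): programs that save via write-temp-then-rename
# silently break the link, turning the file back into an independent copy.
```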
- Jdupes: A powerful duplicate file finder
-
Does jdupes do a 'dry run' if you just specify directory(s) and no other options?
I can work it out by looking at https://github.com/jbruchon/jdupes.
-
replace duplicates with hard links - I think jdupes is the answer, or maybe fclones (I have questions)
I have looked at a few alternatives and think jdupes is the one for me. Then I found out it was not multi-threaded, so I will give it a go, but the developer of jdupes recommended fclones (https://github.com/jbruchon/jdupes/issues/186) if you are dealing with large file systems and want multi-threading. But as I am using an HDD it may not be necessary.
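Replacing a duplicate with a hard link is itself a small, careful operation: the link must stay on the same filesystem, and the swap should be atomic so the duplicate path never disappears. A sketch of that procedure, roughly what jdupes' hard-link mode does (`replace_with_hardlink` and the `.linktmp` suffix are names chosen here, not jdupes internals):

```python
import os
import tempfile

def replace_with_hardlink(original, duplicate):
    """Replace `duplicate` with a hard link to `original`."""
    if os.stat(original).st_dev != os.stat(duplicate).st_dev:
        raise OSError('hard links cannot cross filesystems')
    tmp = duplicate + '.linktmp'   # hypothetical temp name for the swap
    os.link(original, tmp)         # new name for original's inode
    os.replace(tmp, duplicate)     # atomic: `duplicate` is never missing

# Demo: two identical files, then dedupe the second into a hard link.
d = tempfile.mkdtemp()
src = os.path.join(d, 'keep.bin')
dup = os.path.join(d, 'dupe.bin')
for p in (src, dup):
    with open(p, 'wb') as f:
        f.write(b'same bytes')
replace_with_hardlink(src, dup)
```

After the call, both paths share one inode, so the space of the duplicate is reclaimed once the old inode's link count drops to zero.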
-
De-Duping a file server
jdupes is a fork of the old standby fdupes, but it has a Win32 release as well as POSIX support.
-
Any good duplicate file finder for windows?
jdupes is a tuned fork of the well-known fdupes, and has Win32 releases.
What are some alternatives?
lizard - Lizard (formerly LZ5) is an efficient compressor with very fast decompression. It achieves a compression ratio comparable to zip/zlib and zstd/brotli (at low and medium compression levels) at a decompression speed of 1000 MB/s and faster.
fdupes - FDUPES is a program for identifying or deleting duplicate files residing within specified directories.
lexbor - An open-source HTML renderer library under development. https://lexbor.com
dupeguru - Find duplicate files
FPC - FPC - Fast Prefix Coder
rmlint - Extremely fast tool to remove duplicates and other lint from your filesystem
cgif - GIF encoder written in C
rdfind - find duplicate files utility
wyhash - The FASTEST QUALITY hash function, random number generators (PRNG) and hash map.
duperemove - Tools for deduping file systems
czkawka - Multi functional app to find duplicates, empty folders, similar images etc.
fclones - Efficient Duplicate File Finder