photo-ingest vs rmlint

| | photo-ingest | rmlint |
|---|---|---|
| Mentions | 1 | 16 |
| Stars | 7 | 1,776 |
| Growth | - | - |
| Activity | 3.5 | 5.8 |
| Latest commit | 9 months ago | 4 months ago |
| Language | PowerShell | C |
| License | MIT License | GNU General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
photo-ingest
- Huge number of photos (40k+) scattered across multiple hard drives and Google Photos
exiftool is awesome, and the author is very responsive and helpful. It does a lot of the heavy lifting in applications such as GeoSetter, and it is very powerful; it is well worth the time and effort to learn the tool and its options. I used to run a few different exiftool incantations by copy-pasting them from notes. I got tired of doing that and eventually wrapped them up in helper scripts I now use routinely on both Windows and Linux. The sources are on GitHub - photo-ingest - in case you're interested.
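As a flavor of the kind of incantation being wrapped here, the sketch below uses exiftool's documented `-Directory` tag-copy syntax to file photos into a date-based tree; the source and destination paths are illustrative, and this is not necessarily what photo-ingest itself runs.

```shell
# Copy (rather than move, thanks to "-o .") every photo under /mnt/incoming
# into /mnt/sorted/YYYY/MM/DD based on its EXIF capture date.
# Paths are illustrative; see the exiftool docs for -Directory and -d.
exiftool -o . '-Directory<DateTimeOriginal' -d '/mnt/sorted/%Y/%m/%d' -r /mnt/incoming
```

Files without a `DateTimeOriginal` tag are left untouched, which is why wrappers around this usually add a second pass for the leftovers.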
rmlint
- fdupes: Identify or Delete Duplicate Files
My preferred solution is rmlint (https://github.com/sahib/rmlint), mostly because it also looks at duplicate directories. It produces a bash script instead of deleting anything itself, so you can examine exactly what will be removed before running the script it generated.
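The review-then-delete workflow described above looks roughly like this; the directory paths are illustrative. By default rmlint writes its findings to `rmlint.sh` (and `rmlint.json`) rather than deleting anything, and the `-D` flag enables the duplicate-directory detection mentioned in the quote.

```shell
# Scan two photo trees; -D (--merge-directories) also reports whole
# directories that are duplicates of each other. Paths are illustrative.
rmlint -D ~/photos /mnt/backup/photos

less rmlint.sh   # inspect exactly what would be removed
sh rmlint.sh     # run it only once you are satisfied
```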
- ZFS 2.2.0 (RC): Block Cloning merged
After I removed duplicates (with the help of https://github.com/sahib/rmlint), I migrated my photos to an ordinary zpool instead.
- I decluttered 14,000 digital items within a few hours. Here's how I did it.
For the technically savvy among you, there is an excellent open source program called 'rmlint' (a.k.a. Remove Lint). It is excellent at finding duplicates and saved me terabytes of space.
- Looking for Powerful Deduplication software
You don’t say if you are on Windows or Unix. I have used rmlint successfully in the past.
- The very best anti-duplicate app?
dupeguru or rmlint
- Deleting duplicates: which programs?
rmlint, my friend, is the last tool you will ever need for this
- script to remove redundant parent directories
- Is there software that scans for duplicates?
- data hoarding software
- Go Find Duplicates: blazingly fast, simple-to-use tool to find duplicate files
I use and test assorted duplicate finders regularly.
fdupes is the classic (going way back), but it's really slow and no longer worth using.
The four I know of that are worth trying these days are https://github.com/jbruchon/jdupes , https://github.com/pauldreik/rdfind , https://github.com/jvirkki/dupd , and https://github.com/sahib/rmlint ; depending on the data set, hardware, file arrangement, and other factors, any one of them might be fastest for a specific use case.
I had not encountered fclones before; I will give it a try.
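All of the tools compared above are, at their core, grouping files whose contents hash identically. A brute-force sketch of that idea with standard GNU utilities (the real tools add size pre-filtering and partial hashing so most files are never fully read):

```shell
# Hash every file under a directory, then print groups of files whose
# SHA-256 hashes collide, separated by blank lines. "-w64" tells uniq to
# compare only the 64-hex-character hash prefix of each line.
# The directory argument is illustrative.
set -eu
dir="${1:-.}"
find "$dir" -type f -exec sha256sum {} + \
  | sort \
  | uniq -w64 --all-repeated=separate
```

This is orders of magnitude slower than jdupes or rmlint on large trees, but it makes the underlying technique concrete.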
What are some alternatives?
darktable - darktable is an open source photography workflow application and raw developer
jdupes - A powerful duplicate file finder and an enhanced fork of 'fdupes'.
AntiDupl - A program to search similar and defect pictures on the disk
rdfind - find duplicate files utility
fdupes - FDUPES is a program for identifying or deleting duplicate files residing within specified directories.
fclones - Efficient Duplicate File Finder
dupeguru - Find duplicate files
czkawka - Multi functional app to find duplicates, empty folders, similar images etc.
rsync - An open source utility that provides fast incremental file transfer. It also has useful features for backup and restore operations among many other use cases.
libpostal - A C library for parsing/normalizing street addresses around the world. Powered by statistical NLP and open geo data.
kindfs - Index filesystem into a database, then easily make queries e.g. to find duplicates files/dirs, or mount the index with FUSE.
exiftool - ExifTool meta information reader/writer