autoscan
qbittools
autoscan
- What it looks like to host a completely automated *arr Suite
- Media added/update push from an *arr
-
Plex Autoscan FOR WINDOWS
Use Cloudbox/autoscan instead. Plex Autoscan is no longer maintained; you can see the repository is archived.
-
Almost instant library scanning compared to the built-in method, and much more lightweight for giant libraries. I'm not the author; I just added Postgres support and want to see it merged!
If you use an *arr to manage your downloads, try setting up its "Connect" section with Plex and scanning only when you import/add new media, so you save on periodic scans that run for no reason. Or check out https://github.com/cloudbox/autoscan, as I've heard good things about it as well.
-
Should Plex move away from SQLite?
You should now use this instead: https://github.com/Cloudbox/autoscan
-
I've written a script to allow sonarr/radarr to inform tdarr of new/changed/deleted files
This tool is designed to let sonarr/radarr directly communicate with tdarr, much like autoscan is able to communicate between sonarr/radarr and plex/emby/jellyfin.
-
Plex + Autoscan (and connector) for refreshing metadata (for rclone mount users)
The solution for having Plex refresh metadata after a Bazarr subtitle add is to use autoscan. NOT plex_autoscan, but autoscan; this is the newer version. You also need autoscan-adapter, the critical piece of the puzzle. Autoscan by itself lets Bazarr/Sonarr/etc. notify Plex to update libraries. The autoscan-adapter is what lets autoscan actually refresh Plex metadata, so the new subtitles from Bazarr can be found. This is all very easy to implement using docker-compose; refer to the docker-compose setup in the autoscan-adapter GitHub.
- Jellyfin erases my libraries when remote mount is down
-
I have a Docker instance with exposed folders that are SMB-mounted drives. The actual files are on a different device mounted via fstab. If I add a movie, it will not automatically scan the folder; I have to scan manually. Any recommendations on how to fix this?
I recommend using Autoscan. If you're using the *arr stack to download your media, it can then automatically trigger a scan. The *arr stack can do this without Autoscan for Jellyfin, but the advantage of Autoscan is that you can plug it into any of your other means of downloading media just by calling its webhook.
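For example, any script that drops new files into the library can hit that webhook itself. A minimal sketch, assuming autoscan's default port 3030 and the manual trigger route from its README (verify both against your install):

```shell
#!/bin/sh
# Hedged sketch: notify autoscan about a changed directory via its
# manual trigger. AUTOSCAN_URL and the /triggers/manual route are
# assumptions taken from Cloudbox/autoscan's README, not this thread.
AUTOSCAN_URL="${AUTOSCAN_URL:-http://localhost:3030}"

scan_dir() {
    # POST the full path; autoscan then forwards a targeted scan of
    # just this folder to Plex/Jellyfin instead of a full-library scan.
    curl -fsS -X POST "$AUTOSCAN_URL/triggers/manual" --data-urlencode "dir=$1"
}
```

Call it from the tail end of whatever downloads your media, e.g. `scan_dir "/mnt/media/Movies/Example (2024)"`.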
- How to refresh a single library via command or API?
qbittools
-
qBittorrent settings for racing?
Until then you can use something like qbittools and run it with re-announce on an interval.
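A sketch of that interval setup; the `reannounce` subcommand name and `-p` WebUI-port flag are assumptions from the qbittools README, so check `qbittools --help` on your version:

```shell
#!/bin/sh
# Hedged sketch: re-announce on an interval by looping. The
# "reannounce" subcommand and -p (WebUI port) flag are assumptions
# from the qbittools README; verify with `qbittools --help`.
QBT_PORT="${QBT_PORT:-8080}"
INTERVAL="${INTERVAL:-60}"   # seconds between re-announce passes

reannounce_loop() {
    # Re-announce, then wait; stop if the tool itself fails.
    while qbittools reannounce -p "$QBT_PORT"; do
        sleep "$INTERVAL"
    done
}
```

Run it in the background (`reannounce_loop &`) or as a service while racing; if your qbittools version already loops internally, a single invocation is enough.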
- automatic cross seeding
-
Automatically resume torrent stalled by any kind of error
Might be worth checking this out: https://gitlab.com/AlexKM/qbittools
-
can I limit rtorrent to one download at a time
If you don't want to mess with any of this, then maybe you can set a max of 1 download per hour in your filter settings. Not as ideal as using a tool like qbittools or qbittorrent-cli, but it should still help with HDD I/O. Then again, rTorrent is not ideal for racing torrents.
-
A Comprehensive Guide on How To Automate your Seedboxing
A category for your *arr downloads, one for your racing on your /ssd, and another for files to permaseed on /hdd. These categories can then be managed by other third-party tools. One example that I use is called qbittools, which has a "mover" function.
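As an illustration of that category-driven workflow (this is not qbittools' actual mover; its flags vary by version), here is a sketch against qBittorrent's documented WebUI API, with hypothetical category names and paths. It needs curl and jq:

```shell
#!/bin/sh
# Hedged sketch: re-home completed "racing" torrents onto the permaseed
# disk and switch their category, using qBittorrent's /api/v2 WebUI API.
# Host, category names, and the target path are placeholders, not from
# the thread.
QBT_URL="${QBT_URL:-http://localhost:8080}"

# Join newline-separated info-hashes into the "h1|h2" form the API expects.
join_hashes() {
    paste -sd '|' -
}

move_to_permaseed() {
    # Completed torrents still categorised as "racing".
    hashes=$(curl -fsS "$QBT_URL/api/v2/torrents/info?filter=completed&category=racing" |
        jq -r '.[].hash' | join_hashes)
    [ -n "$hashes" ] || return 0
    # Relocate the data to the HDD, then re-label the torrents.
    curl -fsS -X POST "$QBT_URL/api/v2/torrents/setLocation" \
        --data "hashes=$hashes" --data "location=/hdd/permaseed"
    curl -fsS -X POST "$QBT_URL/api/v2/torrents/setCategory" \
        --data "hashes=$hashes" --data "category=permaseed"
}
```

Run `move_to_permaseed` from a nightly cron job once the race is over; qBittorrent moves the payload off the SSD and the torrents keep seeding from /hdd.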
-
Best filters nowadays for auto-irssi ?
Other than that, you don't need much for TL. I would only suggest that you implement some kind of solution so you don't hammer your disk I/O by trying to race too many torrents at once, especially on an HDD. As an easy solution, you could do General > Max 1-2 downloads per hour. Better yet, if you use qBittorrent, you can use something like qbittorrent-cli or qbittools to limit the number of concurrent races to 1-2 or whatever you see fit.
- [Guide] Ultraseedbox AutoDL-irssi with qBittorrent
-
First seedbox - ultra.cc
Tool to inject and re-announce using qBittorrent
-
[Need help] Automatically stop public torrents in qBittorrent
Set this up to run hourly via a cronjob, and it can pause every torrent from whatever tracker you add to it.
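The hourly job can also be sketched directly against qBittorrent's documented WebUI API rather than any particular tool (host and tracker pattern below are placeholders; requires curl and jq):

```shell
#!/bin/sh
# Hedged sketch: pause every torrent whose tracker URL matches a given
# pattern, meant to run hourly from cron. QBT_URL and the pattern are
# placeholders; the /api/v2 routes are qBittorrent's documented WebUI API.
QBT_URL="${QBT_URL:-http://localhost:8080}"

# Join newline-separated info-hashes into the "h1|h2" form the API expects.
join_hashes() {
    paste -sd '|' -
}

pause_tracker() {
    # Pull each torrent's first tracker URL and keep the matching hashes.
    hashes=$(curl -fsS "$QBT_URL/api/v2/torrents/info" |
        jq -r --arg p "$1" '.[] | select(.tracker | contains($p)) | .hash' |
        join_hashes)
    [ -n "$hashes" ] || return 0
    curl -fsS -X POST "$QBT_URL/api/v2/torrents/pause" --data "hashes=$hashes"
}
```

A crontab entry like `0 * * * * /usr/local/bin/pause-tracker.sh` (with the script ending in `pause_tracker "tracker.example.org"`) then pauses the matching torrents every hour.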
- trumped torrents
What are some alternatives?
plex_autoscan - Script to assist sonarr/radarr with plex imports. Will only scan the folder that has been imported, instead of the whole library section.
qbittorrent-nox-static - A bash script which builds a fully static qbittorrent-nox binary with current dependencies to use on any Linux OS
Tdarr - Tdarr - Distributed transcode automation using FFmpeg/HandBrake + Audio/Video library analytics + video health checking (Windows, macOS, Linux & Docker)
Tautulli - A Python based monitoring and tracking tool for Plex Media Server.
Cloudbox - Ansible-based solution for rapidly deploying a Docker containerized cloud media server.
rtorrent - rTorrent BitTorrent client
Tautulli-Wiki - Wiki for Tautulli
Kometa - Python script to update metadata information for items in plex as well as automatically build collections and playlists. The Wiki Documentation is linked below.
cloudplow - Automatic rclone remote uploader, with support for multiple remote/folder pairings. UnionFS Cleaner functionality: Deletion of UnionFS whiteout files and their corresponding files on rclone remotes. Automatic remote syncer: Sync between different remotes via a Scaleway server instance, that is created and destroyed at every sync.
PlexTraktSync - A python script that syncs the movies, shows and ratings between trakt and Plex (without needing a PlexPass or Trakt VIP subscription)
plex-agents - FileBot Xattr Metadata Scanners & Plug-ins for Plex