Video Transcoding vs timeline
| | Video Transcoding | timeline |
|---|---|---|
| Mentions | 9 | 5 |
| Stars | 2,348 | 6 |
| Growth | - | - |
| Activity | 0.0 | 8.9 |
| Latest Commit | 7 months ago | 8 days ago |
| Language | Ruby | JavaScript |
| License | MIT License | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Video Transcoding
-
The Deception of “Buying” Digital Movies
I use this project by Don Melton to get a Blu-ray video down to an 8 - 10 GB file size: https://github.com/donmelton/video_transcoding
It uses HandBrake, FFmpeg, MKVToolNix, and MP4v2 with some custom tuned settings and has really good results from my experience.
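As a rough illustration of the workflow described above, here is a minimal Python sketch that builds a `transcode-video` invocation (the command-line tool shipped by the video_transcoding gem). The specific flags (`--mp4`, `--crop detect`, `--output`) are taken from the gem's documented options as I understand them, and the file names are placeholders, not the commenter's actual setup.

```python
import subprocess

def build_transcode_command(rip_path, output_dir):
    """Build a transcode-video invocation (flags assumed from the
    video_transcoding gem's documented options)."""
    return [
        "transcode-video",
        "--mp4",              # produce an MP4 container instead of the default MKV
        "--crop", "detect",   # auto-detect and remove black bars
        "--output", output_dir,
        rip_path,
    ]

if __name__ == "__main__":
    cmd = build_transcode_command("Movie.mkv", "encoded/")
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment once the gem is installed
```

The command is built as a list (rather than a shell string) so file names with spaces pass through `subprocess.run` safely.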
-
What Are Your Most Used Self Hosted Applications?
I have primarily used Plex, and pretty much everything you said is accurate for Plex as well: transcoding is limited by the machine it is running on. As disk space has become cheaper, I have pretty much stopped doing batch transcodes, which is great for the most part. But there are definitely negatives when you want to watch something offline or remotely. The biggest pain point is subtitles, though. Since they aren't ripped as text and then sent to a client, they have to be burned into the video itself and transcoded on the fly, which means losing out on 'forced' ones if it can't transcode fast enough.
Plex has definitely started to try and commercialize itself more and offer other stuff, when all I want is access to my own media. So I may look into Jellyfin more soon.
As for batch transcode jobs, I had a system that I was able to set up as essentially a black box. Drop a rip into a folder and out the other side comes a smaller one at a reasonable quality. With forced subs burned right into the actual video. Mostly based on https://github.com/donmelton/video_transcoding
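The "black box" described above (drop a rip in, get a smaller file out, forced subs burned in) could be sketched roughly like this. This is an assumption-laden illustration, not the commenter's actual script: the folder layout is invented, and the `--burn-subtitle scan` flag is my reading of the gem's documented option for burning forced subtitles.

```python
import subprocess
from pathlib import Path

def process_drop_folder(inbox, outbox, run=subprocess.run):
    """Transcode every .mkv rip dropped into `inbox`, writing results to
    `outbox`. Flags are assumptions based on the video_transcoding docs;
    --burn-subtitle scan burns detected forced subtitles into the picture."""
    done = []
    for rip in sorted(Path(inbox).glob("*.mkv")):
        cmd = ["transcode-video", "--burn-subtitle", "scan",
               "--output", str(outbox), str(rip)]
        run(cmd, check=True)   # blocks until this rip is finished
        done.append(rip.name)
    return done
```

The `run` callable is injected so the loop can be exercised without the gem installed; a cron job or watch daemon would call `process_drop_folder` with the defaults.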
- I know this is a super specific thing to ask, but would anyone that rips their collection to a Plex server care to share your compression settings?
-
BluRay Movie File Size Question
I use Don Melton’s tools to transcode videos to mp4 files. His tools make use of HandBrake, but he has it tuned to produce very small video files of very high quality. You are unlikely to notice the difference when watching the videos.
-
Hit my goal. 100 movies in one year. Done the “old fashioned” way (rip -> encode). Made it with two days to spare. (Plex server built Sept 29, 2020)
Check out https://github.com/donmelton/video_transcoding. In my experience it produces higher quality and smaller files than HandBrake alone.
-
Best Handbrake settings for transcoding
When I was ripping my disc collection, I used Don Melton’s library. Don originally started the Safari and WebKit projects at Apple, but after he retired, he spent significant time creating an easy way of compressing video while preserving quality. A great collection of tools, in my opinion, that leverage HandBrake for encoding. Good luck!
-
Best Handbrake settings for 4K Blu-ray?
Check out https://github.com/donmelton/video_transcoding
-
r/Plex Moronic Mondays No Stupid Questions Thread
I've been using Don Melton's Video Transcoding tool for my whole library. I upgraded my NAS to a Synology DS1019+ a few months ago, so now I have the space to store the untranscoded MakeMKV files. I've had issues with playback in my system, so I don't mind manually transcoding.
-
I present to you: The ripper
Check out https://github.com/donmelton/video_transcoding – I use it to turn raw Blu-ray rips from 30+ gigs down to 5-7 with no noticeable loss of quality.
timeline
-
Ask HN: Admittedly Useless Side Projects?
My timeline thing. It gathers all my crap and puts it onto a timeline. It's a more fine-grained version of scrolling to a specific date on my photo stream.
https://github.com/nicbou/timeline
It serves no purpose, but somehow it attracted one contributor.
It's pointless on purpose. It's the thing I work on when I want to forget about work, and build purely for myself.
-
Ask HN: What's your personal backup strategy?
Google Drive as a first line of defence. It's been solid for a really long time.
I also run hourly rsync backups to my home server, and propagate them to a Hetzner file storage server. This is done by my timeline thing [0]. The timeline thing backs up files from multiple devices, but also geolocation, social media posts, and other data I consider valuable. It's extensible, so I can add new inputs/outputs as needed.
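An hourly rsync push like the one described above might be wired up as below. This is a generic sketch: the paths are placeholders and the flag choices (`--archive`, `--delete`, `--compress`) are common rsync practice, not details from the commenter's setup.

```python
import subprocess

def backup_command(source, destination):
    """Build an rsync mirror command suitable for an hourly cron job.
    Paths are placeholders, not the author's actual layout."""
    return [
        "rsync",
        "--archive",   # preserve permissions, timestamps, and symlinks
        "--delete",    # remove files from the copy that vanished at the source
        "--compress",  # cheaper over the wire to a remote box
        source, destination,
    ]

if __name__ == "__main__":
    cmd = backup_command("/home/me/files/", "backup@storagebox:/files/")
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)
```

Note that `--delete` also propagates ransomware-style destruction, which is exactly why the threat list below matters: a mirror is not a versioned backup.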
Whatever your backup strategy is, consider the following threats:
- Your files are held hostage by ransomware, and the damage spreads to your backup
- Your house is destroyed by fire
- You lose your 2FA device
- You are locked out of your Google/Apple/Microsoft account
- You are incapacitated, and someone needs to take over
I have four of those threats covered. I am working on the last one.
[0] https://github.com/nicbou/timeline
-
What Are Your Most Used Self Hosted Applications?
My own timeline thing.
It hosts all of my data plus my personal diary. I update it at least once a day. My photos, backups and geolocation are automatically uploaded to it.
https://github.com/nicbou/timeline
My home server gets a lot of use too. It's mostly my own code, plus Transmission.
https://github.com/nicbou/homeserver
I also have a few lines of code that take my browser's search queries and route them according to keywords. Browsers do this natively now, but old habits die hard. Every search query goes through it.
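A keyword-based search router of the kind described above can genuinely be just a few lines. The keyword table and URL templates below are made up for illustration, not taken from the author's code.

```python
# Minimal keyword-based search routing: "gh timeline" goes to GitHub search,
# anything without a known keyword falls through to a default engine.
from urllib.parse import quote_plus

ROUTES = {
    "w":  "https://en.wikipedia.org/wiki/Special:Search?search={}",
    "gh": "https://github.com/search?q={}",
}
DEFAULT = "https://duckduckgo.com/?q={}"

def route(query):
    keyword, _, rest = query.partition(" ")
    template = ROUTES.get(keyword)
    if template and rest:
        return template.format(quote_plus(rest))
    return DEFAULT.format(quote_plus(query))
```

In practice this sits behind a tiny HTTP endpoint registered as the browser's search engine, returning a redirect to the routed URL.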
-
Ask HN: Who wants to help promote RSS?
I added RSS to my websites, because my timeline thing (https://github.com/nicbou/timeline) uses them to retrieve posts from my websites.
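Consuming a site's RSS feed to retrieve posts, as described above, needs nothing beyond the standard library. This is a generic sketch of reading an RSS 2.0 document; the timeline project's actual reader may work differently.

```python
# Pull (title, link) pairs out of an RSS 2.0 feed with the standard library.
import xml.etree.ElementTree as ET

def parse_rss(xml_text):
    """Return (title, link) pairs for each <item> in an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    return [
        (item.findtext("title"), item.findtext("link"))
        for item in root.iter("item")
    ]
```

Fetching the feed is a single `urllib.request.urlopen` call; `iter("item")` finds items regardless of channel nesting.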
However, I see the death of RSS as the symptom of a larger problem: when platforms get big enough, they restrict access to their data. RSS feeds disappear, but so do other machine-readable endpoints. If it wasn't for GDPR, there would be no way to export that data. GDPR gave us clunky one-time exports, but even those are often incomplete.
The industry has a strong incentive to kill RSS, since the readers can strip the valuable bits (content or data) from the business bits (analytics, monetisation). RSS users are hard to count or monetise.
This is a battle worth fighting, but it's not one you should expect to win.
-
What is your “I don't care if this succeeds” project?
https://github.com/nicbou/timeline
It regroups my personal data, and displays it on a timeline. Sort of like if Google Photos also included reddit posts, personal journal entries, text messages and other slices of life.
I do it both as a way to back up files and photos, and as a way to keep an enhanced journal.
What are some alternatives?
Streamio FFMPEG - Simple yet powerful ruby ffmpeg wrapper for reading metadata and transcoding movies
Som - Parser, code model, navigable browser and VM for the SOM Smalltalk dialect
Tdarr - Distributed transcode automation using FFmpeg/HandBrake + Audio/Video library analytics + video health checking (Windows, macOS, Linux & Docker)
react-qml - Build native, high-performance, cross-platform applications through a React (and/or QML) syntax
HandBrake - HandBrake's main development repository
Simula - A Simula 67 parser written in C++ and Qt
automatic-ripping-machine - Automatic Ripping Machine (ARM) Scripts
RSSHub - 🧡 Everything is RSSible
makemkv-autorip-script - A bash script for automatically ripping movies using MakeMKV, with parallelization for multiple drives.
callibella - Sync your personal calendar to your work calendar, privately 🐒
node-makemkv - Web UI for MakeMKV
worker-planet - Generate a single page (and feed) with content from multiple RSS/Atom sources. Runs on Cloudflare Workers.