movie-parser vs pytorch_nsfw_model
| | movie-parser | pytorch_nsfw_model |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 71 | 50 |
| Growth | - | - |
| Activity | 0.0 | 10.0 |
| Last commit | about 2 years ago | about 5 years ago |
| Language | TypeScript | Jupyter Notebook |
| License | - | - |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
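The activity score described above weights recent commits more heavily than older ones. A minimal sketch of one way such a recency weighting could work (the exponential-decay formula and the 90-day half-life are assumptions for illustration, not the site's actual metric):

```python
def activity_score(commit_ages_days, half_life_days=90):
    """Toy recency-weighted commit score: each commit contributes
    2 ** (-age / half_life_days), so a commit made today counts 1.0
    and a commit one half-life old counts 0.5.
    (Assumed formula for illustration -- not the site's real metric.)"""
    return sum(2 ** (-age / half_life_days) for age in commit_ages_days)

# Three recent commits score far higher than three year-old commits:
recent = activity_score([1, 7, 30])
stale = activity_score([400, 500, 600])
```

In practice a site would also normalize such raw scores across all tracked projects to get a relative ranking like "top 10%".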
movie-parser

- Hacker News top posts: Feb 8, 2022: Show HN: An AI program to check videos for NSFW content (32 comments)
- Show HN: An AI program to check videos for NSFW content
pytorch_nsfw_model

- Show HN: An AI program to check videos for NSFW content
It's interesting. I've not tested the model on anything too risqué, but again, with the well-known Baywatch intro as a frame of reference: wide-angle group shots of the whole cast in their swimsuits are fine, while a close-up of any single cast member in the famous red swimsuit will invariably trigger the model. Male or female.
In the blog, I suggest this could be the result of an uncurated dataset, which is one part of it. Or perhaps the dataset was fine, and this is pushing the hard limit of what ResNet50 can do (the off-the-shelf model I use for this is a ResNet50 extension).
Some of the anomalous results are amusing. One day, I passed through a video of a female violinist in concert, and the model flagged every close-up of her as NSFW! Just those close-ups. Wide shots, and close-ups of other musicians, were absolutely fine.
Again, some of that might be down to me (clunky code, a very low NSFW threshold), and I suspect the model I used was itself a PoC (https://github.com/emiliantolo/pytorch_nsfw_model). But it does make you wonder how the bigger labs behind critical products, like Palantir, handle doubts like this.
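The frame-by-frame flagging the comment describes can be sketched as below. The function name, the per-frame probabilities, and the threshold values are all hypothetical; the point is only how a very low NSFW threshold turns borderline frames into false positives:

```python
def flag_frames(frame_scores, threshold=0.2):
    """Return indices of frames whose NSFW probability exceeds the
    threshold. With a very low threshold (as the commenter used),
    many borderline frames get flagged as false positives."""
    return [i for i, p in enumerate(frame_scores) if p > threshold]

# Hypothetical per-frame probabilities from a ResNet50-style classifier:
scores = [0.05, 0.25, 0.10, 0.60]
print(flag_frames(scores))        # low threshold flags frames 1 and 3
print(flag_frames(scores, 0.5))   # a stricter threshold flags only frame 3
```

Raising the threshold trades recall for precision: the violinist close-ups scoring just above a low cutoff would no longer be flagged, but genuinely NSFW frames with modest scores could slip through.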
What are some alternatives?
- movie-parser-cli
- darknet - Convolutional Neural Networks
- wingman_jr - The official repository (https://github.com/wingman-jr-addon/wingman_jr) for the Wingman Jr. Firefox addon, which filters NSFW images in the browser fully client-side: https://addons.mozilla.org/en-US/firefox/addon/wingman-jr-filter/. Optional DNS blocking using Cloudflare's 1.1.1.1 for Families! Also, check out the blog!
- model - The model for filtering NSFW images backing the Wingman Jr. plugin: https://github.com/wingman-jr-addon/wingman_jr