movie-parser-cli VS pytorch_nsfw_model

Compare movie-parser-cli and pytorch_nsfw_model to see how they differ.

                movie-parser-cli                          pytorch_nsfw_model
Mentions        1                                         1
Stars           8                                         50
Growth          -                                         -
Activity        0.0                                       10.0
Last commit     about 2 years ago                         about 5 years ago
Language        Python                                    Jupyter Notebook
License         GNU General Public License v3.0 or later  -
Mentions indicates the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

movie-parser-cli

Posts with mentions or reviews of movie-parser-cli. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-02-08.

pytorch_nsfw_model

Posts with mentions or reviews of pytorch_nsfw_model. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-02-08.
  • Show HN: An AI program to check videos for NSFW content
    7 projects | news.ycombinator.com | 8 Feb 2022
    It's interesting. I've not tested the model on anything too risqué, but, with the well-known Baywatch intro as a frame of reference, wide-angle group shots of the whole cast in their swimsuits are fine, while a close-up of any single cast member in the famous red swimsuit will invariably trigger the model, male or female.

    In the blog, I suggest this could be the result of an uncurated dataset, which is one part of it. Or perhaps the dataset was fine, and this is pushing the hard limit of what ResNet50 can do (the off-the-shelf model I use for this is a ResNet50 extension).

    Some of the anomalous results are amusing. One day, I ran a video of a female violinist in concert through it, and the model flagged every close-up of her as NSFW! Just those close-ups; wide shots and close-ups of other musicians were absolutely fine.

    Again, some of that might be down to me (clunky code, a very low NSFW threshold), and I suspect the model I used was itself a PoC (https://github.com/emiliantolo/pytorch_nsfw_model). But it does make you wonder how the bigger labs with critical products, like Palantir, handle doubts like this.
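    The low-threshold flagging the poster describes can be sketched roughly as below. This is a hypothetical illustration, not the repo's documented API: the five class labels, the helper names, and the 0.2 threshold are all assumptions (common to several NSFW ResNet50 ports); real use would load the repository's pretrained checkpoint rather than an untrained head.

    ```python
    import torch
    from torch import nn
    from torchvision import models

    # Assumed 5-way label set; pytorch_nsfw_model's actual labels may differ.
    CLASSES = ["drawings", "hentai", "neutral", "porn", "sexy"]

    def build_nsfw_model() -> nn.Module:
        # A stock ResNet50 with the final layer swapped for a 5-class head,
        # mirroring the "ResNet50 extension" the post mentions. In practice
        # you would load the repo's .pth weights here.
        model = models.resnet50(weights=None)
        model.fc = nn.Linear(model.fc.in_features, len(CLASSES))
        model.eval()
        return model

    def nsfw_score(model: nn.Module, batch: torch.Tensor) -> torch.Tensor:
        # Sum the probability mass of the "unsafe" classes per image.
        with torch.no_grad():
            probs = torch.softmax(model(batch), dim=1)
        unsafe = [CLASSES.index(c) for c in ("hentai", "porn", "sexy")]
        return probs[:, unsafe].sum(dim=1)

    model = build_nsfw_model()
    frames = torch.rand(2, 3, 224, 224)  # stand-in for preprocessed video frames
    scores = nsfw_score(model, frames)
    THRESHOLD = 0.2  # a deliberately low threshold, as in the post
    flags = scores > THRESHOLD
    ```

    With a threshold this low, a borderline close-up only needs ~20% combined "unsafe" probability to be flagged, which is one plausible mechanism for the false positives described above.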

What are some alternatives?

When comparing movie-parser-cli and pytorch_nsfw_model you can also consider the following projects:

movie-parser - NWJS wrapper for a wider project.

darknet - Convolutional Neural Networks

model - The model for filtering NSFW images backing the Wingman Jr. plugin: https://github.com/wingman-jr-addon/wingman_jr

wingman_jr - The official repository (https://github.com/wingman-jr-addon/wingman_jr) for the Wingman Jr. Firefox addon, which filters NSFW images in the browser fully client-side: https://addons.mozilla.org/en-US/firefox/addon/wingman-jr-filter/ It also offers optional DNS blocking for families via Cloudflare's 1.1.1.1. Also, check out the blog!