wingman_jr VS pytorch_nsfw_model

Compare wingman_jr vs pytorch_nsfw_model and see what their differences are.

wingman_jr

This is the official repository (https://github.com/wingman-jr-addon/wingman_jr) for the Wingman Jr. Firefox addon, which filters NSFW images in the browser fully client-side: https://addons.mozilla.org/en-US/firefox/addon/wingman-jr-filter/ It also offers optional DNS blocking for families via Cloudflare's 1.1.1.1. Also check out the blog! (by wingman-jr-addon)

pytorch_nsfw_model

PyTorch model for NSFW classification, with a usage example (by emiliantolo)
                 wingman_jr                              pytorch_nsfw_model
Mentions         6                                       1
Stars            33                                      46
Growth           -                                       -
Activity         6.2                                     10.0
Latest commit    3 months ago                            about 5 years ago
Language         JavaScript                              Jupyter Notebook
License          GNU General Public License v3.0+        -
Mentions - the total number of mentions we have tracked, plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

wingman_jr

Posts with mentions or reviews of wingman_jr. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-11-18.

pytorch_nsfw_model

Posts with mentions or reviews of pytorch_nsfw_model. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-02-08.
  • Show HN: An AI program to check videos for NSFW content
    7 projects | news.ycombinator.com | 8 Feb 2022
    It's interesting. I haven't tested the model on anything too risqué, but again, with the well-known Baywatch intro as a frame of reference: wide-angle group shots of the whole cast in their swimsuits are fine, while a close-up of any single cast member in the famous red swimsuit will invariably trigger the model, male or female.

    In the blog, I suggest this could partly be the result of an uncurated dataset. Or perhaps the dataset was fine, and this is pushing the hard limit of what ResNet50 can do (the off-the-shelf model I use for this is a ResNet50 extension).

    Some of the anomalous results are amusing. One day, I ran a video of a female violinist in concert through it, and the model flagged every close-up of her as NSFW! Just those close-ups. Wide shots, and close-ups of other musicians, were absolutely fine.

    Again, some of that might be down to me (clunky code, a very low NSFW threshold). And I suspect the model I used was itself a PoC (https://github.com/emiliantolo/pytorch_nsfw_model). But it does make you wonder how the bigger labs with critical products, like Palantir, handle doubts like this.

What are some alternatives?

When comparing wingman_jr and pytorch_nsfw_model you can also consider the following projects:

movie-parser - NWJS wrapper for a wider project.

darknet - Convolutional Neural Networks

model - The model for filtering NSFW images backing the Wingman Jr. plugin: https://github.com/wingman-jr-addon/wingman_jr

movie-parser-cli

nsfw-filter - A free, open-source, privacy-focused browser extension that blocks "not safe for work" content, built using TypeScript and TensorFlow.js.