AFAIK Discord's NSFW filter is not a perceptual hash and doesn't use the NCMEC database (although that might indeed be in the pipeline elsewhere); instead it uses an ML classifier. I'm certain it doesn't use perceptual hashes, as Discord doesn't have a catalogue of NSFW image hashes to compare against. My guess is it's either open_nsfw[0] or Google's Cloud Vision, since the rest of Discord's infrastructure runs on Google Cloud VMs. There's a web demo of this API available[1].
0: https://github.com/yahoo/open_nsfw
1: https://cloud.google.com/vision#section-2
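To illustrate the distinction being drawn: a perceptual hash matches near-duplicate images against a known catalogue, whereas a classifier scores arbitrary new images. Below is a minimal average-hash (aHash) sketch in pure Python, purely to show what the hash-matching approach looks like; real systems use far more robust hashes (e.g. PhotoDNA), and a production version would first downscale the image to a small grayscale grid.

```python
# Minimal average-hash (aHash) sketch, for illustration only.
# A real pipeline would resize the image to e.g. 8x8 grayscale first.

def average_hash(pixels):
    """pixels: flat list of grayscale values.
    Each bit is 1 if the pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits; a small distance means the images
    are visually similar, so the hash survives minor edits."""
    return bin(a ^ b).count("1")

# Two nearly identical "images" differ in one pixel value, yet the
# thresholding keeps their hashes close (here, identical).
img1 = [10, 200, 30, 220, 15, 210, 25, 230]
img2 = [10, 200, 30, 220, 15, 210, 25, 235]
assert hamming(average_hash(img1), average_hash(img2)) <= 1
```

The key property is that matching only works against a pre-built database of hashes of known images, which is exactly why a service without such a catalogue would reach for a classifier instead.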