> You’d have to not only have over 30 hash collisions
That's trivial. If the attacker can get one image onto your device, they can get several.
It's very easy to construct preimages for Apple's neural hash function, including fairly good-looking ones (e.g. https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issue... )
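For anyone curious how such attacks work in general: since the hash comes from a neural network, you can just gradient-descend the pixels of a benign image until its binarized hash matches a target. Below is a minimal sketch of that idea. The `ToyPerceptualHash` network is a hypothetical stand-in for Apple's model (the linked repo extracts the real one to ONNX); the loss terms and hyperparameters are illustrative assumptions, not Apple's or the repo's actual code.

```python
# Sketch of a gradient-based second-preimage attack on a perceptual hash.
# ASSUMPTION: ToyPerceptualHash is a made-up stand-in; a real attack would
# load the extracted NeuralHash ONNX model instead.
import torch
import torch.nn as nn

torch.manual_seed(0)

class ToyPerceptualHash(nn.Module):
    """Stand-in perceptual hash: image -> 96 raw logits.
    The hash bits are the signs of the logits."""
    def __init__(self, bits=96):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(16 * 4 * 4, bits),
        )
    def forward(self, x):
        return self.net(x)

model = ToyPerceptualHash().eval()
for p in model.parameters():
    p.requires_grad_(False)

# Target: hash bits of the image we want to collide with.
target_img = torch.rand(1, 3, 64, 64)
target_bits = model(target_img).sign()

# Start from an innocuous image and nudge its pixels until the binarized
# hash matches. A hinge loss pushes each logit past a margin in the
# target bit's direction; a proximity term keeps the image looking benign.
base_img = torch.rand(1, 3, 64, 64)
x = base_img.clone().requires_grad_(True)
opt = torch.optim.Adam([x], lr=1e-2)

for step in range(2000):
    opt.zero_grad()
    logits = model(x)
    hinge = torch.relu(0.1 - target_bits * logits).mean()  # match the bits
    stay_close = 1e-2 * (x - base_img).pow(2).mean()       # stay near original
    (hinge + stay_close).backward()
    opt.step()
    with torch.no_grad():
        x.clamp_(0, 1)  # keep valid pixel range
    if (model(x).sign() == target_bits).all():
        print(f"hash matched after {step} steps")
        break
```

The real attacks against NeuralHash work the same way, just against the extracted production model, which is why the resulting collisions can be made to look like ordinary photos.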
> collide with another secret hash function
The supposed other 'secret' hash function cannot be secret from the state actors generating the databases.
Also, if the second function has a similar architecture or training data, it's not that unlikely that the same adversarial images would collide with it by chance.
> also have a human look at it and agree it’s CP
That's straightforward: simply use nude or pornographic images that look like they could be of children, or ones where you can't tell without context. It's a felony for reviewers to fail to report child porn if they see it in review, and NCMEC guidance tells people to report when in doubt.
Besides, once someone else has looked at your pictures your privacy has been violated.