Convert Apple NeuralHash model for CSAM Detection to ONNX.
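For readers who haven't used the export: computing a hash with the converted model is short — resize the image to the network's input size, run the ONNX graph, project the 128-dim embedding through the seed matrix, and binarize to 96 bits. A minimal sketch, assuming the file names and preprocessing constants used by the upstream repo's scripts (model.onnx, neuralhash_128x96_seed1.dat, 360x360 input scaled to [-1, 1]); verify these against your own export.

```python
# Hedged sketch: compute a NeuralHash with the converted ONNX model.
# File names and preprocessing follow the upstream repo's scripts and are
# assumptions here; adjust if your export differs.
import numpy as np
import onnxruntime
from PIL import Image

def neural_hash(image_path, model_path="model.onnx",
                seed_path="neuralhash_128x96_seed1.dat"):
    # Load the 96x128 projection matrix that maps the embedding to hash bits.
    seed = np.frombuffer(open(seed_path, "rb").read()[128:], dtype=np.float32)
    seed = seed.reshape([96, 128])

    # Preprocess: 360x360 RGB, scaled to [-1, 1], NCHW layout.
    img = Image.open(image_path).convert("RGB").resize((360, 360))
    arr = np.asarray(img).astype(np.float32) / 255.0
    arr = (arr * 2.0 - 1.0).transpose(2, 0, 1).reshape(1, 3, 360, 360)

    # Run the network, project the embedding, and binarize to a hex string.
    session = onnxruntime.InferenceSession(model_path)
    input_name = session.get_inputs()[0].name
    embedding = session.run(None, {input_name: arr})[0].flatten()
    bits = "".join("1" if v >= 0 else "0" for v in seed @ embedding)
    return "{:0{}x}".format(int(bits, 2), len(bits) // 4)

# Example usage (hypothetical file name):
# print(neural_hash("some_image.png"))
```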
> You’d have to not only have over 30 hash collisions
That's trivial: if an attacker can get one image onto your device, they can get several.
It's very easy to construct preimages for Apple's neural hash function, including fairly good-looking ones (e.g. https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issue... )
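The reason this is easy is that the hash comes from a differentiable network, so a target hash can be treated as an optimization target for the input pixels. A hedged sketch of the idea, assuming you have a differentiable PyTorch re-implementation of the exported network (the ONNX graph itself isn't usable for autograd as-is); the loss weights and step counts are illustrative, not tuned values from the linked issue.

```python
# Hedged sketch of a gradient-based second-preimage attack on a neural
# perceptual hash. `model` is assumed to be a differentiable PyTorch module
# mapping a (1,3,H,W) image in [-1,1] to a 128-dim embedding; `seed` is the
# 96x128 projection matrix.
import torch

def find_collision(model, seed, source_img, target_bits,
                   steps=2000, lr=0.01, visual_weight=0.05):
    """Perturb source_img until its hash bits match target_bits (0/1 array)."""
    seed = torch.as_tensor(seed, dtype=torch.float32)            # (96, 128)
    target = torch.as_tensor(target_bits, dtype=torch.float32)   # 96 bits
    signs = target * 2.0 - 1.0                                    # map to +/-1

    x = source_img.clone().detach().requires_grad_(True)
    opt = torch.optim.Adam([x], lr=lr)

    for _ in range(steps):
        logits = seed @ model(x).flatten()       # pre-binarization values
        if ((logits > 0).float() == target).all():
            break                                # hash bits match the target
        opt.zero_grad()
        # Hinge loss pushes every projected value to the correct side of zero;
        # the MSE term keeps the perturbed image visually close to the original.
        hash_loss = torch.clamp(0.1 - signs * logits, min=0).sum()
        loss = hash_loss + visual_weight * torch.nn.functional.mse_loss(x, source_img)
        loss.backward()
        opt.step()
        x.data.clamp_(-1.0, 1.0)                 # stay in the model's input range
    return x.detach()
```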
> collide with another secret hash function
The supposed other 'secret' hash function cannot be secret from the state actors generating the databases.
Also, if it has a similar structure/training, it's not that unlikely that the same images would collide by chance.
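If you ever had access to such a second model, checking whether a crafted collision transfers is just a matter of hashing the image under both networks and comparing against the database entries. A small sketch under those assumptions (the second model, its file name, and its seed matrix are all hypothetical):

```python
# Sketch: test whether a crafted collision transfers to a second,
# similarly-trained hash model. Both model paths and seed matrices are
# hypothetical; the second model is assumed to use the same input format.
import numpy as np
import onnxruntime

def hash_of(model_path, seed, arr):
    """96-bit hash of a preprocessed image `arr` under one ONNX model."""
    sess = onnxruntime.InferenceSession(model_path)
    emb = sess.run(None, {sess.get_inputs()[0].name: arr})[0].flatten()
    return (seed @ emb >= 0).astype(np.uint8)

def transfers(crafted_arr, target_a, target_b, seed_a, seed_b):
    # The attack only survives a second check if the crafted image matches
    # the database hash under both the public and the hypothetical model.
    h_a = hash_of("model_a.onnx", seed_a, crafted_arr)
    h_b = hash_of("model_b.onnx", seed_b, crafted_arr)
    return np.array_equal(h_a, target_a) and np.array_equal(h_b, target_b)
```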
> also have a human look at it and agree it’s CP
That's straightforward: use nude or pornographic images that look like they could be of children, or ones where you can't tell without context. It's a felony for reviewers to fail to report child porn if they see it in review, and NCMEC guidance tells people to report when in doubt.
Besides, once someone else has looked at your pictures your privacy has been violated.
Question about the looming chat control law (Chatkontrolle)
1 project | reddit.com/r/de_EDV | 13 May 2022
Apple’s CSAM troubles may be back, as EU plans a law requiring detection
2 projects | reddit.com/r/apple | 11 May 2022
OnlyFans accused of conspiring to blacklist rivals
1 project | reddit.com/r/privacy | 22 Feb 2022
Apple removed CSAM Detection from their Child Safety website
1 project | reddit.com/r/apple | 15 Dec 2021
Apple Removes All References to Controversial CSAM Scanning Feature from Its
1 project | news.ycombinator.com | 15 Dec 2021