Hash collision in Apple NeuralHash model

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • AppleNeuralHash2ONNX

    Convert Apple NeuralHash model for CSAM Detection to ONNX.

  • I think I explained it adequately. An attacker gets some hashes likely to be in the database. The attacker modifies legal pornography of young-looking people, perhaps close-ups of genitalia, to match the child-porn hashes.

    There are plenty of examples of US persons being charged over lawful pornography that a prosecutor accused of being child porn; I linked to one such example, where the accused was rescued only by the testimony of the actress, and where the prosecution had expert witnesses testifying that images of an adult were images of a child.

    The attacker in this case would know the origin of the images and could point to them. The victim would have no idea where they came from.

    I have searched the case law, and can find no case where the NCMEC provided an image for comparison. Are you aware of any?

    You also seem to have continued to move the goalposts. Merely being accused of possessing child porn would be extremely damaging to a person. The fact that they might escape conviction by a jury after years of legal ordeal, incarceration, and a ruined reputation is not much consolation.

    > The idea that new images of nudity could be caught in it is because

    This is an absurd claim. I have demonstrated (https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issue...) that it is possible to alter existing images to match an arbitrary NeuralHash while the result remains a perfectly normal-looking image. This point is no longer a matter of dispute.

    I could easily do so with a nude or pornographic image; I only chose to use SFW pictures out of good taste.

    My interest in Apple's scheme arose out of their malicious use of cryptography to shield their actions from accountability. I didn't even hear about the iMessage nudity scanner until some time later. You can be certain no such confusion applies here.
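The "alter an existing image to match an arbitrary NeuralHash" claim above is an instance of a gradient-based preimage attack on a perceptual hash. A minimal sketch of the idea, using a toy differentiable hash (random projection plus sign threshold) as a stand-in for the real NeuralHash network — all sizes, names, and parameters here are illustrative assumptions, not Apple's model:

```python
import numpy as np

# Toy stand-in for a perceptual hash: a fixed random projection of the
# flattened image followed by a sign threshold. The real NeuralHash uses a
# CNN embedding plus a hyperplane hash, but the attack shape is the same:
# the hash is a smooth function of the pixels up to the final threshold.
rng = np.random.default_rng(0)
N_PIXELS, N_BITS = 64, 16            # tiny sizes, for illustration only
W = rng.normal(size=(N_BITS, N_PIXELS))

def toy_hash(img):
    return (W @ img > 0).astype(int)

def collide(img, target_bits, lr=0.05, steps=500):
    """Nudge pixels until toy_hash(img) matches target_bits.

    Gradient ascent on the pre-threshold logits of the not-yet-matched
    bits; each step is a small perturbation, so the result stays close
    to the original image.
    """
    x = img.copy()
    sign = 2 * np.asarray(target_bits) - 1       # {0,1} -> {-1,+1}
    for _ in range(steps):
        margin = sign * (W @ x)
        viol = margin < 0.1                       # bits not safely matched yet
        if not viol.any():
            break
        grad = (sign[viol, None] * W[viol]).sum(axis=0)
        x += lr * grad
    return x

original = rng.normal(size=N_PIXELS)
target = rng.integers(0, 2, size=N_BITS)          # arbitrary chosen hash
adv = collide(original, target)
assert (toy_hash(adv) == target).all()
print("perturbation L2 norm:", np.linalg.norm(adv - original))
```

Against the real model the same loop runs through the ONNX network with automatic differentiation, but the principle — the hash is differentiable, so a target hash can be reached by small pixel changes — is identical.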

  • neural-hash-collider

    Preimage attack against NeuralHash 💣

  • "Apple said that there is a one in one trillion chance of a false positive"

    https://techcrunch.com/2021/08/05/apple-icloud-photos-scanni...

    Versus:

    https://blog.roboflow.com/nerualhash-collision/

    Along with:

    https://github.com/anishathalye/neural-hash-collider

    This demonstrates blatant incompetence in Apple's design. At this point, trust is broken, and with it the credibility of any other claim made about the security or basic function of the system.

    As for a simple example of one of many potential attack vectors (search HN for "neuralhash" and read the comments for countless examples), here is a basic attack:

    Because Apple's hash algorithm is entirely broken, a malicious actor can easily craft an NSFW but otherwise legal image whose hash matches one in the CSAM database and which is visually ambiguous in that context (for example, a close-up). Targeted attacks could then be launched on unsuspecting victims as simply as sending them an image.

    If the victim's app is configured to auto-save to iCloud — as some are by default, and as many can be configured to be — the image will be automatically flagged as a match. It may well pass human review, because it may indeed look just like CP, and an innocent person, who may not even be aware of the image's presence in their iCloud library, may get visited by the police.

    That in itself is bad enough.

    However, it gets worse. Bear in mind that "the police" is not an equivalent concept across nations, yet iPhones are used ubiquitously across the world. In certain countries, a "visit from the police" over such automatically flagged content could well result in a presumption of guilt, to the point of immediately administered punishment and a destroyed innocent life.

    In China, for example, conviction rates are routinely 99.99%; being accused of a crime there is equivalent to being convicted, and that is just one major country with well over 100 million iPhone users.

    Apple is demonstrating not only gross incompetence in the design of this "system" but flagrant disregard for human rights, while utterly destroying its own long-cultivated pro-privacy stance.

    It's a disaster for Apple and its customers in every possible way.
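For context on the "one in one trillion" figure quoted above: NeuralHash is reported to produce a 96-bit hash, and Apple's figure models *accidental* matches of honest inputs. Under that random model the odds really are astronomical — which is precisely why a working preimage attack, where collisions are constructed rather than accidental, invalidates the guarantee. A quick back-of-the-envelope check (the database size and image counts below are assumed round numbers, not Apple's actual analysis):

```python
# Assumes a 96-bit hash whose outputs behave like uniform random bit strings
# for non-adversarial inputs -- the model behind Apple's probability claim.
BITS = 96
SPACE = 2 ** BITS

def p_match_database(n_db_hashes):
    """P[one random image's hash hits any entry of an n-entry database]."""
    return n_db_hashes / SPACE

def expected_random_collisions(k_images):
    """Expected colliding pairs among k random images (birthday bound)."""
    return k_images * (k_images - 1) / 2 / SPACE

print(p_match_database(1_000_000))         # ~1.26e-23 per image
print(expected_random_collisions(10**12))  # ~6.3e-6 pairs, even at a trillion images
```

The point of the collision demos linked above is that these numbers say nothing about an attacker, who does not sample hashes at random but steers an image toward a chosen one.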

  • hardened_malloc

    Hardened allocator designed for modern systems. It has integration into Android's Bionic libc and can be used externally with musl and glibc as a dynamic library for use on other Linux-based platforms. It will gain more portability / integration over time.

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.

Suggest a related project

Related posts

  • Interest in Switching to iPhone Drops Among Android Users Ahead of iPhone 13 Launch, Survey Shows

    2 projects | /r/iphone | 31 Aug 2021
  • Apple Just Gave Millions of Users a Reason to Quit Their iPhones

    3 projects | news.ycombinator.com | 22 Aug 2021
  • Apple Just Gave Millions Of Users A Reason To Quit Their iPhones

    3 projects | /r/technology | 22 Aug 2021
  • WhatsApp forces Pegasus spyware maker to share its secret code

    2 projects | news.ycombinator.com | 2 Mar 2024
  • EncroChat

    1 project | news.ycombinator.com | 16 Feb 2024