ImageNet contains naturally occurring Apple NeuralHash collisions

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • AppleNeuralHash2ONNX

    Convert Apple NeuralHash model for CSAM Detection to ONNX.
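
    As a concrete illustration, here is a minimal sketch of computing a hash with the exported model, following the approach of the repo's nnhash.py script. The file names (model.onnx, neuralhash_128x96_seed1.dat) are the ones the repo's instructions have you extract from an Apple device; treat the preprocessing details as assumptions if your setup differs.

        # Sketch: compute a NeuralHash from the ONNX export; mirrors nnhash.py.
        import sys
        import numpy as np
        import onnxruntime
        from PIL import Image

        session = onnxruntime.InferenceSession("model.onnx")

        # The seed file carries a 96x128 projection matrix after a 128-byte header.
        with open("neuralhash_128x96_seed1.dat", "rb") as f:
            seed = np.frombuffer(f.read()[128:], dtype=np.float32).reshape([96, 128])

        # Preprocess: 360x360 RGB, pixel values scaled to [-1, 1], NCHW layout.
        image = Image.open(sys.argv[1]).convert("RGB").resize([360, 360])
        arr = np.array(image).astype(np.float32) / 255.0 * 2.0 - 1.0
        arr = arr.transpose(2, 0, 1).reshape([1, 3, 360, 360])

        # Run the network, project the 128-dim embedding, threshold to 96 bits.
        embedding = session.run(None, {session.get_inputs()[0].name: arr})[0].flatten()
        bits = "".join("1" if v >= 0 else "0" for v in seed.dot(embedding))
        print("%024x" % int(bits, 2))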

  • It doesn't seem like it; check out the adversarially constructed images here. They don't look anything like the original despite perfectly matching the NeuralHash: https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issue...

  • neuralhash-collisions

    A catalog of naturally occurring images whose Apple NeuralHash is identical.

  • The sample set is ImageNet, a well-known computer-vision dataset, available for download here: https://www.kaggle.com/c/imagenet-object-localization-challe...

    I'd love to see this work extended; if you find additional collisions in the wild, please submit a PR to the repo (but do not submit artificially generated adversarial images): https://github.com/roboflow-ai/neuralhash-collisions
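
    One way to automate the search, sketched below: hash every image in a dataset and group the paths by digest; any group with more than one member is a candidate collision. The neuralhash(path) helper is a hypothetical wrapper around code like the sketch above, not part of either repo.

        # Sketch: find candidate NeuralHash collisions by grouping images on
        # their 96-bit digest. Assumes a neuralhash(path) helper like the one
        # sketched above (hypothetical name, not part of either repo).
        from collections import defaultdict
        from pathlib import Path

        def find_collisions(root):
            groups = defaultdict(list)
            for path in Path(root).rglob("*.JPEG"):  # ImageNet ships .JPEG files
                try:
                    groups[neuralhash(path)].append(path)
                except OSError as exc:  # corrupt files happen at this scale
                    print(f"skipping {path}: {exc}")
            return {h: ps for h, ps in groups.items() if len(ps) > 1}

        for digest, paths in find_collisions("imagenet/train").items():
            print(digest, *(str(p) for p in paths))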

  • neural-hash-collider

    Preimage attack against NeuralHash 💣
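
    The core idea, stripped of the repo's machinery, is gradient descent on the input image against a relaxed version of the hash. A conceptual PyTorch sketch follows; model (the NeuralHash network as a differentiable module) and seed (the 96x128 projection matrix as a tensor) are assumptions here, since the repo itself drives the converted model directly rather than a PyTorch port.

        # Conceptual sketch of the preimage attack: perturb an image until the
        # relaxed hash matches a target. `model` and `seed` are assumptions,
        # not the repo's actual interface.
        import torch

        def collide(model, seed, source, target_bits, steps=1000, lr=0.01):
            # target_bits: tensor of +1/-1, one entry per hash bit to force.
            x = source.clone().requires_grad_(True)
            opt = torch.optim.Adam([x], lr=lr)
            for _ in range(steps):
                logits = seed @ model(x).flatten()  # pre-sign hash values
                if (torch.sign(logits) == target_bits).all():
                    break  # every bit is already on the requested side of zero
                # Hinge loss pushes each wrong bit past the correct sign with a
                # margin; the distortion term keeps the result near the source.
                loss = (torch.clamp(1.0 - target_bits * logits, min=0).sum()
                        + 0.1 * (x - source).pow(2).mean())
                opt.zero_grad()
                loss.backward()
                opt.step()
                with torch.no_grad():
                    x.clamp_(-1, 1)  # stay inside the model's input range
            return x.detach()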

  • Yes, I can. This is just one possible strategy; there are many others that vary the steps or perform them in a different order.

    You use the collider [1] and one of the many scaling attacks ([2], [3], [4] are just the ones linked in this thread) to create an image that matches the hash of a reasonably fresh CSAM image currently circulating on the Internet, yet resizes to some legal sexual or violent image. Note that knowing such a hash and possessing such an image are both perfectly legal. Moreover, since the resizing (the creation of the visual derivative) is done on the client, you can tailor your scaling attack to the specific resampling algorithm; a conceptual sketch of such a scaling attack follows the references below.

    Eventually, someone will make a CyberTipline report about the actual CSAM image whose hash you used, and that image (being genuine CSAM) will make its way into the NCMEC hash database. You will even be able to tell precisely when this happens, since you hold the client-side half of the PSI database and can execute the NeuralHash algorithm yourself.

    You can start circulating the meme before or after this step. Repeat until you have circulated enough photos to make sure that many people in the targeted group have exceeded the threshold.

    Note that the memes will trigger automated CSAM matches and still pass the Apple employee's visual inspection: because of the safety voucher system, the reviewer sees only the low-resolution visual derivative, never the full-size image, so they have no way of telling that the NeuralHash match is a false positive.

    [1] https://github.com/anishathalye/neural-hash-collider

    [2] https://embracethered.com/blog/posts/2020/husky-ai-image-res...

    [3] https://bdtechtalks.com/2020/08/03/machine-learning-adversar...

    [4] https://graphicdesign.stackexchange.com/questions/106260/ima...
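
    For reference, here is a conceptual sketch of the scaling-attack half of this (the idea behind [2]-[4]): optimize a full-size image that stays visually close to an innocuous decoy but downsamples to an arbitrary payload. Bilinear interpolation stands in for the client's resampler below; as noted above, a real attack must match the exact resampling algorithm used.

        # Conceptual sketch of an image-scaling attack: the full-size result
        # looks like `decoy`, but resampling it to `out_size` yields `payload`.
        # torch's bilinear interpolate is a stand-in for the actual client-side
        # resampler, which a real attack would have to match exactly.
        import torch
        import torch.nn.functional as F

        def scaling_attack(decoy, payload, out_size, steps=2000, lr=0.01, w=10.0):
            # decoy: NCHW tensor at full resolution; payload: NCHW at out_size.
            x = decoy.clone().requires_grad_(True)
            opt = torch.optim.Adam([x], lr=lr)
            for _ in range(steps):
                small = F.interpolate(x, size=out_size, mode="bilinear",
                                      align_corners=False)
                # Weighted trade-off: the downscaled view must match the payload
                # while the full-size image stays close to the decoy.
                loss = w * F.mse_loss(small, payload) + F.mse_loss(x, decoy)
                opt.zero_grad()
                loss.backward()
                opt.step()
                with torch.no_grad():
                    x.clamp_(0, 1)  # keep pixels in valid range
            return x.detach()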
