Apple: Reverse-Engineered NeuralHash Algorithm Is Generic, Not the One Used for CSAM

By Rajesh Pandey

Published 19 Aug 2021


Developer Asuhariet Ygvar managed to reverse engineer what he believed was Apple’s NeuralHash algorithm for its CSAM-detection tool. He found the NeuralHash code hidden in iOS 14.3, extracted it, and rebuilt it in Python. Soon after, other developers were able to create “hash collisions” in the NeuralHash algorithm: two non-identical images that produce the same hash.
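
A perceptual hash is designed so that similar-looking images map to the same value, which is also what makes collisions like these possible. The Python snippet below is a minimal sketch of the idea using a toy “average hash”, not Apple’s NeuralHash: two images with different pixel values but the same bright/dark layout end up with identical hashes.

```python
# Minimal sketch of a perceptual "average hash" -- NOT Apple's NeuralHash,
# just an illustration of how two non-identical images can collide.
# The "images" here are toy 4x4 grayscale grids (pixel values 0-255).

def average_hash(pixels):
    """Hash each pixel as 1 if brighter than the image's mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = "".join("1" if p > mean else "0" for p in flat)
    return int(bits, 2)

# Two clearly different images...
image_a = [
    [200, 200,  10,  10],
    [200, 200,  10,  10],
    [ 10,  10, 200, 200],
    [ 10,  10, 200, 200],
]
# ...with the same bright/dark layout but different pixel values.
image_b = [
    [180, 250,  30,   0],
    [220, 190,  20,  40],
    [  5,  35, 240, 210],
    [ 25,  15, 170, 230],
]

assert image_a != image_b                              # not identical images
assert average_hash(image_a) == average_hash(image_b)  # but the hashes collide
print(f"collision: {average_hash(image_a):#06x}")
```

NeuralHash is a neural-network-based hash rather than a pixel-average one, but the collision principle the developers demonstrated is the same: the function deliberately ignores fine detail, so distinct inputs can share an output.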

In a statement, however, Apple says that the NeuralHash algorithm the developer extracted is a generic version, not the final one it uses in iCloud Photos for CSAM detection. It also confirmed that once an account crosses a threshold of 30 matching hashes, a second, non-public algorithm runs on Apple’s servers to verify the results, followed by a manual human review.

“The NeuralHash algorithm [… is] included as part of the code of the signed operating system [and] security researchers can verify that it behaves as described,” Apple’s documentation reads.
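
Taken together, the reported flow is: on-device matches accumulate against an account, nothing happens below the 30-match threshold, and only past it do the server-side check and human review run. The sketch below is a hypothetical illustration of that control flow; the function names and the stubbed server-side check are assumptions for illustration, since Apple’s second algorithm is deliberately not public.

```python
# Hypothetical sketch of the reported review flow for iCloud Photos matches.
# Function names and the stubbed server-side check are illustrative
# assumptions; Apple's second, server-side algorithm is not public.

MATCH_THRESHOLD = 30  # Apple's stated number of matches before any review

def server_side_check(matches: list) -> bool:
    """Stand-in for Apple's non-public second verification algorithm."""
    # The real check re-verifies match data on Apple's servers; this stub
    # merely represents that step and is NOT the actual algorithm.
    return len(matches) >= MATCH_THRESHOLD

def handle_account(matches: list) -> str:
    if len(matches) < MATCH_THRESHOLD:
        # Below the threshold, nothing is flagged or reviewable.
        return "no action"
    if not server_side_check(matches):
        # Garbage collisions would be filtered out at this stage.
        return "discarded as false positive"
    # Only after both automated steps does a human reviewer look.
    return "escalated to manual human review"

print(handle_account(["match"] * 5))   # -> no action
print(handle_account(["match"] * 31))  # -> escalated to manual human review
```

This layered design is why, as Weaver notes below, the hash function itself does not need to stay secret: a collision alone cannot trigger a report.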

Nicholas Weaver, a senior researcher at UC Berkeley’s International Computer Science Institute, said, “Apple actually designed this system so the hash function doesn’t need to remain secret, as the only thing you can do with ‘non-CSAM that hashes as CSAM’ is annoy Apple’s response team with some garbage images until they implement a filter to eliminate those garbage false positives in their analysis pipeline.”

Apple will roll out NeuralHash scanning of photos uploaded to iCloud Photos to check for CSAM with the release of iOS 15 and iPadOS 15 later this year. Many iPhone users have been angry and disappointed with Apple since it announced the CSAM detection tool. Some believe it is a breach of their privacy, while others worry that Apple’s CSAM system could eventually be abused by law enforcement agencies and governments around the world.

[Via Motherboard]