Apple’s recent announcement of its new CSAM tool has been marred by controversy. However, almost a year before Apple announced its CSAM initiative, the company’s anti-fraud chief said in an iMessage thread that Apple was “the greatest platform for distributing child porn, etc.”
In the thread, Eric Friedman, Apple’s anti-fraud chief, notes that the “spotlight at Facebook etc is all on trust and safety,” even as those companies perform poorly on privacy. Apple’s priorities, he says, are the inverse: the company focuses more on privacy than on trust and safety. And that trade-off, Friedman argues, is exactly why Apple offers the “greatest platform for distributing child porn.”
The internal conversation does indicate that Apple realized the CSAM problem was too big to ignore. Interestingly, there’s no mention in the conversation of how Friedman knows that iPhones and iCloud Photo sharing are frequently used to share CSAM content. Was the company already testing its NeuralHash algorithm to find CSAM content stored in iCloud Photos?
There has been plenty of controversy since Apple announced its CSAM initiative. The company has developed a NeuralHash algorithm that performs on-device scanning of photos before they are uploaded to iCloud Photos, matching their hashes against a database of known child abuse imagery. Many iPhone users see this as a breach of their privacy and are unhappy with it. Others worry that law enforcement agencies and governments worldwide could repurpose Apple’s tool for their own ends.
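Apple has not published NeuralHash itself, so as a rough sketch of what hash matching against a known database looks like, the Swift snippet below uses SHA-256 as a stand-in. A real perceptual hash, unlike SHA-256, would still match after a photo is resized or re-encoded, and Apple’s actual design uses blinded hashes and cryptographic thresholds rather than a plain lookup. The names imageHash, knownHashes, and matchesKnownDatabase here are hypothetical.

```swift
import Foundation
import CryptoKit

// Stand-in for NeuralHash: a real perceptual hash survives resizing and
// re-encoding, whereas SHA-256 only matches byte-identical files.
func imageHash(_ data: Data) -> String {
    SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Illustrative hash list; in Apple's design the database of known CSAM
// hashes is supplied by child-safety organizations and shipped in blinded
// form inside the OS, not as a plain set like this.
let knownHashes: Set<String> = [
    // hex digests of known images would go here
]

// The device-side check at its simplest: hash the photo locally and test
// membership before the photo is uploaded to iCloud Photos.
func matchesKnownDatabase(_ photo: Data) -> Bool {
    knownHashes.contains(imageHash(photo))
}
```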
As part of the CSAM initiative, Apple will also scan images that a child receives in the Messages app to determine whether they are sexually explicit and, if so, warn the child before the image is viewed.
[Via The Verge]