EU Law Mandating CSAM Detection Could Spell Trouble for Apple

By Chandraveer Mathur

Published 11 May 2022


When it announced plans to scan iPhones for child sexual abuse material (CSAM), Apple stirred up controversy. Although its plans were suspended after strong backlash, the European Commission has proposed legislation to “combat child sexual abuse online.” This could force the iPhone maker to walk a tightrope to comply with the law without igniting controversy again.

Why Did Apple Shelve Its CSAM Detection Plans?

In September last year, Apple announced that it was temporarily putting its CSAM detection plans on hold while it reconsidered its approach. We haven’t heard anything about the company’s goals since, although implementation in some form still appears to be on the cards.

Cloud storage providers routinely scan uploaded content for CSAM, and detected instances are reported to law enforcement agencies. Apple intended to approach CSAM scanning with a focus on user privacy. It planned to use an on-device database of CSAM image hashes to check whether your photos matched any known material. Any matches would be reviewed manually, and if the images from your device were confirmed as CSAM, they would be reported to law enforcement.
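For illustration, here is a minimal Swift sketch of the general hash-matching idea. It is hypothetical: the type and its names are invented, and Apple’s actual design used a perceptual hash (NeuralHash) compared against the database through cryptographic private set intersection, not the plain SHA-256 lookup shown here.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of hash-based matching. Apple's real system used a
// perceptual hash (NeuralHash) and private set intersection; this simplified
// version compares exact SHA-256 digests against a local set of known hashes
// purely to illustrate the general flow.
struct CSAMHashMatcher {
    let knownHashes: Set<Data>  // on-device database of known image hashes

    // Returns true if the photo's hash appears in the local database.
    func matches(photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        return knownHashes.contains(Data(digest))
    }
}
```

An exact digest like this breaks on any re-encoding of an image, which is precisely why Apple opted for a perceptual hash that tolerates resizing and compression.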

Privacy advocates, cybersecurity experts, human rights organizations, and even Apple staffers took to social media to explain the glaring issues with the company’s method. Firstly, there is always a possibility of false positives, and a false match reported to the authorities could ruin a person’s reputation. Apple addressed this concern by setting a minimum threshold: more than 30 potential CSAM images would have to be detected on a device before the matches were sent for manual review. Additionally, there could be deliberate attempts to defame people by injecting false positives.
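As a hypothetical illustration of that safeguard (the type and names below are invented for this example), a simple counter could gate escalation to human review. Apple’s actual design enforced the threshold cryptographically, using threshold secret sharing so that match vouchers could not be decrypted until the limit was crossed.

```swift
// Hypothetical sketch: accumulate match events and flag for manual review
// only after more than the reported threshold of 30 matches. Apple's real
// design enforced this cryptographically, not with a simple counter.
struct MatchTally {
    static let reviewThreshold = 30
    private(set) var matchCount = 0

    // Records one positive match; returns true once enough matches have
    // accumulated to warrant escalation for human review.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount > Self.reviewThreshold
    }
}
```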

Several prominent figures, including privacy advocate Edward Snowden, warned that governments and law enforcement agencies could modify the database to search people’s devices for political posters and other material. Moreover, the same hash-based on-device searches could just as easily be applied to iMessage.

The Tricky Road Ahead

Earlier today, a Politico report suggested that the European Union would announce a new law to regulate Big Tech companies sometime this week. The proposed legislation would require these companies to scan user devices for CSAM. Now that the proposal has been announced, Apple faces the tough road of complying without rekindling the controversy.

The European Commission reportedly believes that measures taken by some platforms are “insufficient” to tackle “misuse of online services for the purposes of child sexual abuse.”

“Today, the Commission is proposing new EU legislation to prevent and combat child sexual abuse online. With 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone and many more going unreported, child sexual abuse is pervasive. The COVID-19 pandemic has exacerbated the issue, with the Internet Watch Foundation noting a 64 percent increase in reports of confirmed child sexual abuse in 2021 compared to the previous year. The current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires. Up to 95 percent of all reports of child sexual abuse received in 2020 came from one company, despite clear evidence that the problem does not only exist on one platform.”

“The proposed rules will oblige providers to detect, report, and remove child sexual abuse material on their services. Providers will need to assess and mitigate the risk of misuse of their services, and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.”

Apple’s challenge lies in ensuring user privacy while simultaneously creating back doors in encryption protocols for law enforcement agencies. Home Affairs Commissioner Ylva Johansson rightly pointed out, “Abusers hide behind the end-to-end encryption; it’s easy to use but nearly impossible to crack, making it difficult for law enforcement to investigate and prosecute crimes.”

[Via Politico]