Apple Confirms It Already Scans iCloud Mail for CSAM Content

BY Rajesh Pandey

Published 23 Aug 2021


In an acknowledgment that’s bound to raise some eyebrows, Apple has confirmed to 9to5Mac that it already scans iCloud Mail for CSAM content. The company also plans to start scanning iCloud Photos for CSAM with the release of iOS 15 later this year.

That announcement has already proven controversial, and Apple’s acknowledgment that it already scans iCloud Mail for CSAM content is only going to make matters worse, leaving many people feeling their privacy has been violated. For its part, Apple has published an FAQ to address doubts surrounding its CSAM initiative.

Much like the batterygate saga, Apple never hid from users that it was scanning iCloud Mail for CSAM content, but it never explicitly highlighted the practice either. 9to5Mac found Apple mentioning its CSAM scanning in an archived version of its child safety page:

Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation.
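The comparison to spam filters above describes signature-based matching: known illegal images are reduced to fingerprints ("electronic signatures"), and incoming attachments are checked against that set. A minimal sketch of the idea follows; Apple's actual system and hash function are not public, so SHA-256 and the placeholder signature database here are purely illustrative assumptions.

```python
import hashlib

# Hypothetical database of signatures for known images (placeholder values).
# A real system would use a perceptual hash, not a cryptographic one.
known_signatures = {
    hashlib.sha256(b"known-bad-image-1").hexdigest(),
    hashlib.sha256(b"known-bad-image-2").hexdigest(),
}

def matches_known_signature(attachment_bytes: bytes) -> bool:
    """Return True if the attachment's signature is in the known set."""
    signature = hashlib.sha256(attachment_bytes).hexdigest()
    return signature in known_signatures
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; perceptual hashes, by contrast, are designed to survive resizing and re-encoding.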

Jane Horvath, Apple’s chief privacy officer, confirmed at a tech conference in 2020 that the company scans iCloud Mail for CSAM content. If Apple discovers CSAM content in iCloud Mail, it disables the offending account once a certain threshold is reached. Apple has confirmed it has been scanning iCloud Mail attachments for CSAM since 2019.
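The threshold mechanism described above can be sketched as a simple per-account counter: individual matches are tallied, and the account is disabled only once the count crosses a limit. Apple has not published its actual threshold, so the value and function names below are assumptions for illustration only.

```python
from collections import defaultdict

THRESHOLD = 3  # assumed value; Apple's real threshold is not public
match_counts = defaultdict(int)
disabled_accounts = set()

def record_match(account_id: str) -> bool:
    """Record one match for an account; return True if it gets disabled."""
    match_counts[account_id] += 1
    if match_counts[account_id] >= THRESHOLD:
        disabled_accounts.add(account_id)
        return True
    return False
```

A threshold like this reduces the chance that a single false-positive match triggers action against an account.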

Not many iPhone or iPad users are likely using iCloud Mail as their default mail provider, but this revelation could still shake their trust in Apple’s products and services. While Apple wants to cut down on the sharing of CSAM content through its products, critics worry that the NeuralHash algorithm it will use to detect CSAM in iCloud Photos could be repurposed by law enforcement agencies and governments worldwide for their own ends.

[Via 9to5Mac]