Over 90 Civil Rights Groups Ask Apple to Abandon CSAM Plans

By Sanuj Bhatia

Published 19 Aug 2021


Ever since Apple announced CSAM detection for iOS, iPadOS, and macOS earlier this month, it has been a topic of debate. Not only security experts but even some of Apple’s own employees have criticized the plans. Now, more than 90 civil rights groups have written an open letter to Apple, asking the company to walk back its CSAM plans.

The groups share the same concern: the CSAM detection system could be exploited for other purposes as well. Signatories to the letter include the American Civil Liberties Union (ACLU), the Canadian Civil Liberties Association, Australia’s Digital Rights Watch, the UK’s Liberty, and the global Privacy International.

The letter opens by acknowledging Apple’s stated intentions, saying that even though the “capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

“Once this capability is built into Apple products, the company and its competitors will face enormous pressure — and potentially legal requirements — from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable,” it continues.

“Those images may be of human rights abuses, political protests, images companies have tagged as ‘terrorist’ or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them,” says the letter.

In addition, the letter says “pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance, and persecution on a global basis.”

Moreover, the letter argues that the changes could put some children at risk.

“The system Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organizer of the account, and the consequences of parental notification could threaten the child’s safety and wellbeing. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk.”

The letter concludes that even though the groups support Apple’s efforts to reduce child abuse, the company should stand by the privacy values it has long championed.

“We support efforts to protect children and stand firmly against the proliferation of CSAM. But the changes that Apple has announced put children and its other users at risk both now and in the future. We urge Apple to abandon those changes and to reaffirm the company’s commitment to protecting its users with end-to-end encryption. We also urge Apple to more regularly consult with civil society groups, and with vulnerable communities who may be disproportionately impacted by changes to its products and services.”

Apple’s child-safety features have been a matter of debate ever since their announcement. Some say it’s fine for the company to scan photos for child abuse material, while others see it as a breach of their privacy. How do you feel about Apple searching the iCloud Photo Library for CSAM? Do you think this is a breach of your privacy? Drop a comment and let us know your thoughts!