Yesterday, Apple announced several Child Safety features for iPhone, including iCloud photo scanning and an iMessage blur for sensitive images. The new features scan content uploaded to iCloud to detect Child Sexual Abuse Material (CSAM), and if sensitive material is detected on a child's iPhone, a warning is sent to both the child and their parents. The features have come under scrutiny as privacy advocates, including Edward Snowden, have slammed them.
Many security experts and privacy advocates have openly criticized Apple’s plan to scan users’ iCloud photos. They allege that nothing stops Apple from expanding the system into mass surveillance across its devices. Ross Anderson, professor of security engineering at the University of Cambridge, says, “It is an appalling idea because it is going to lead to distributed bulk surveillance of . . . our phones and laptops.” Security researchers are also calling the move a “regressive step for individual privacy.”
No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.
They turned a trillion dollars of devices into iNarcs—*without asking.* https://t.co/wIMWijIjJk
— Edward Snowden (@Snowden) August 6, 2021
Apple Internal Memo Leaked
Digital rights organization EFF has highlighted potential issues with the new feature. The foundation says Apple has created a backdoor.
To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.
An internal memo, addressed to Apple staff, attempts to address some of these concerns.
Today marks the official public unveiling of Expanded Protections for Children, and I wanted to take a moment to thank each and every one of you for all of your hard work over the last few years. We would not have reached this milestone without your tireless dedication and resiliency.
Keeping children safe is such an important mission. In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal, Product Marketing and PR. What we announced today is the product of this incredible collaboration, one that delivers tools to protect children, but also maintain Apple’s deep commitment to user privacy.
We’ve seen many positive responses today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we’ve built. And while a lot of hard work lays ahead to deliver the features in the next few months, I wanted to share this note that we received today from NCMEC. I found it incredibly motivating, and hope that you will as well.
The memo includes a message from the executive director at the National Center for Missing and Exploited Children.
I wanted to share a note of encouragement to say that everyone at NCMEC is SO PROUD of each of you and the incredible decisions you have made in the name of prioritizing child protection.
It’s been invigorating for our entire team to see (and play a small role in) what you unveiled today.
I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.
Our voices will be louder.
Our commitment to lift up kids who have lived through the most unimaginable abuse and victimizations will be stronger.
During these long days and sleepless nights, I hope you take solace in knowing that because of you, many thousands of sexually exploited victimized children will be rescued and will get a chance at healing and the childhood they deserve.
Thank you for finding a path forward for child protection while preserving privacy.
Apple has yet to issue a public statement addressing the backlash. The feature will only scan images uploaded to iCloud, which is already the norm among other web services. [via 9to5Mac]