Apple recently ran into problems with its new feature that scans iCloud photos for CSAM. Now, a non-profit watchdog says the company has done very little to protect children from adult content on the App Store, and its report adds that Apple is not ready to accept responsibility.
The Campaign for Accountability (CfA) claims that Apple is unable to protect children from explicit content. This is the same group that complained about Apple removing App Store games at the directive of the Chinese government. The Tech Transparency Project (TTP) report claims that Apple has not implemented even basic measures to protect children from adult material on the App Store.
Michelle Kuppersmith, CfA Executive Director, says, “Apple claims that it maintains a tight grip over App Store creators to protect consumers from harmful content,” adding, “but it hasn’t even put up the most obvious safeguard to keep underage users safe.”
To prove its point, the organization created a test account for a fictitious 14-year-old. Surprisingly, the researchers were able to download apps related to pornography, dating, and gambling. What’s worse, there was no warning or anything of that sort. The organization questions how Apple could allow a 14-year-old to access such content. On a related note, it is not clear whether the organization enabled the manual parental-control toggle that helps gatekeep inappropriate content.
The report shows how the organization downloaded 75 apps rated 17+ from the App Store. Most of the apps had only a rudimentary age-verification step that did little to keep minors out. On the brighter side, many gambling apps asked for government-issued identification. However, some skipped that step entirely.
App Store in Crisis Mode
Apple is trying hard to pitch the App Store as a safe and secure marketplace for app distribution. In the recent past, however, the App Store has run into regulatory hurdles. Authorities are investigating Apple for mandating the use of its own payment system in apps. Apart from that, scam apps loom large on the store, raking in millions of dollars.
[via Tech Transparency]