Apple slams the brakes on plans to scan user images for child abuse content

Apple has paused plans to scan devices for child abuse and exploitation material after the tool prompted concern among users and privacy groups.  

Announced last month, the new safety features were intended for inclusion in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. The first was a feature for monitoring the Messages application, using client-side machine learning to scan images and alert when sexually explicit material is detected, asking the user whether or not they want to view it.

“As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it,” the company explained.

The second batch of changes affected Siri and Search, adding updates to give parents and children additional information, warn them when they stumble into “unsafe” situations, and “intervene” when a user searches for Child Sexual Abuse Material (CSAM).

The third was a CSAM-scanning tool, touted as a means to “protect children from predators who use communication tools to recruit and exploit them.”

According to the iPhone and iPad maker, the tool would use cryptography “to help limit the spread of CSAM online” while also catering to user privacy. Images would not be scanned in the cloud; instead, on-device matching would be performed, comparing images against hashes of known CSAM images.
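
In practice, that description boils down to a local lookup: hash each photo on the device and check whether the hash appears in a database of known CSAM hashes. The Swift sketch below illustrates the general shape of such a check; note that Apple's actual pipeline uses a proprietary perceptual hash (NeuralHash) and a private set intersection protocol rather than a plain SHA-256 comparison, so the hash function, function name, and parameters here are illustrative stand-ins only.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. Apple's real system uses a proprietary perceptual
// hash (NeuralHash) and a private set intersection protocol, not a plain
// SHA-256 lookup; SHA-256 merely stands in for "some hash of the image bytes".

/// Hypothetical on-device check: does this image's hash appear in a local
/// copy of the known-hash database?
func matchesKnownHash(imageData: Data, knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```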

“CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos,” the company said. “This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.”

In a technical paper (.PDF) describing the tool, Apple said:

“CSAM Detection enables Apple to accurately identify and report iCloud users who store known CSAM in their iCloud Photos accounts. Apple servers flag accounts exceeding a threshold number of images that match a known database of CSAM image hashes so that Apple can provide relevant information to the National Center for Missing and Exploited Children (NCMEC). This process is secure, and is expressly designed to preserve user privacy.”
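
The safeguard in that description is the threshold: a single match reveals nothing, and an account is only flagged once the number of matching images crosses a set limit. The Swift sketch below illustrates that counting logic in the simplest possible terms; the type name, property names, and the threshold value are hypothetical, and Apple's actual design relies on threshold secret sharing so that its servers cannot read any match information until the threshold is exceeded.

```swift
// Minimal sketch of the threshold idea in the paper excerpt above.
// All names and the threshold value are hypothetical, for illustration only.

struct AccountScanState {
    let accountID: String
    var matchedImageCount = 0

    /// Illustrative value only, not a confirmed Apple parameter.
    static let reportingThreshold = 30

    /// Record one more image that matched the known-hash database.
    mutating func recordMatch() {
        matchedImageCount += 1
    }

    /// True once the account crosses the reporting threshold.
    var exceedsThreshold: Bool {
        matchedImageCount >= Self.reportingThreshold
    }
}

// Usage: individual matches do nothing; only crossing the threshold flags the account.
var state = AccountScanState(accountID: "example-account")
(1...31).forEach { _ in state.recordMatch() }
print(state.exceedsThreshold)  // true once the illustrative threshold is crossed
```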

However, the scanner sparked controversy online, drawing criticism from privacy advocates and cryptography experts.

Matthew Green, a cryptographer and associate professor at the Johns Hopkins Information Security Institute, said the use of cryptographic hash matching to scan for specific images could become “a key ingredient in adding surveillance to encrypted messaging systems.”

While created with good intentions, such a tool could become a powerful weapon in the wrong hands, such as those of authoritarian governments and dictatorships. 

The Electronic Frontier Foundation also slammed the plans and launched a petition to put pressure on Apple to backtrack. At the time of writing, the petition had over 27,000 signatures. Fight for the Future and OpenMedia also launched similar petitions.

On September 3, Apple said the rollout had been halted in order to take “additional time” to analyze the tools and their potential future impact.

“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material,” Apple said. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Green said it was a positive move on Apple’s part to take the time to consider the rollout. The EFF said it was “pleased” with Apple’s decision, but added that listening is not enough — the tech giant should “drop its plans to put a backdoor into its encryption entirely.”

“The features Apple announced a month ago, intending to help protect children, would create an infrastructure that is all too easy to redirect to greater surveillance and censorship,” the digital rights group said. “These features would create an enormous danger to iPhone users’ privacy and security, offering authoritarian governments a new mass surveillance system to spy on citizens.”
