Fact Check

Did Apple Say It Would Scan iPhones for Child Sexual Abuse Imagery?

The tech giant has previously resisted government and law enforcement calls for greater access to users' private data.

Published Aug. 7, 2021

Updated Sept. 3, 2021
An employee reconditions smartphones, mainly iPhones, at the headquarters of Largo, a Back Market refurbishing subcontractor, in Sainte-Luce-sur-Loire, outside Nantes, on Jan. 26, 2021. (Loic Venance/AFP via Getty Images)
Claim:
In August 2021, Apple announced plans to begin scanning iPhones in the United States for child sexual abuse material.
Context

In September 2021, after this article was originally published, Apple announced it was postponing the rollout of the child sexual abuse imagery scanning project, but said it still intended to implement those plans, albeit after taking time to "collect input and make improvements."

In August 2021, news outlets reported that Apple was planning to scan iCloud photographs for child sexual abuse imagery. For example, on Aug. 5, the Financial Times published an article with the headline "Apple plans to scan US iPhones for child abuse imagery." The piece stated:

Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.

Apple detailed its proposed system — known as “neuralMatch” — to some U.S. academics earlier this week, according to two security researchers briefed on the virtual meeting. The plans could be publicised more widely as soon as this week, they said.

The technology website Engadget published a similar article later that day, with the headline "Apple reportedly plans to begin scanning iPhones in the US for child abuse images."

On Aug. 5, Apple itself confirmed that it planned to introduce a new system of "on-device matching" between photographs stored on Apple devices and a library of known child abuse "image hashes" (a kind of photographic fingerprint), in order to report photographs that are highly likely to show child abuse to law enforcement agencies.
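Apple has not released the code behind this matching step, but the general idea of checking an image "fingerprint" against a database of known fingerprints can be illustrated with a brief, hypothetical sketch. The example below uses an ordinary SHA-256 digest as a stand-in for the perceptual, "neuralMatch"-style fingerprint described in the reports; the sample hash value and function names are made up for illustration.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-CSAM image fingerprints. In the system the
# reports describe, these would be perceptual hashes supplied by child safety
# organizations, robust to resizing and recompression; a SHA-256 digest of the
# raw bytes is used here only to keep the sketch self-contained.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_fingerprint(path: Path) -> str:
    """Compute a hex fingerprint of an image file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_material(path: Path) -> bool:
    """Return True if this image's fingerprint appears in the known database."""
    return image_fingerprint(path) in KNOWN_HASHES
```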

In a statement, Apple said the new system would be rolled out later in 2021, through updates to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey. The company explained:

...New technology in iOS and iPadOS will allow Apple to detect known CSAM [child sexual abuse material] images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations...

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result.
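Apple's announcement did not include implementation details for the private set intersection step it mentions. As a rough, hypothetical illustration of the underlying concept (determining whether two sets overlap without either side revealing its contents), the toy Diffie-Hellman-style sketch below blinds each party's items with a secret exponent; it is not Apple's protocol, and its parameters are chosen for clarity, not security.

```python
import hashlib
import secrets

# Toy illustration of the idea behind private set intersection: two parties
# learn how many items their sets share without revealing the items themselves.
# This is NOT Apple's protocol; the modulus and message flow are simplified.
P = 2**521 - 1  # a Mersenne prime, used here as a toy modulus

def hash_to_group(item: str) -> int:
    """Map an item (e.g., an image fingerprint) to a group element in [2, P-2]."""
    digest = hashlib.sha256(item.encode()).digest()
    return int.from_bytes(digest, "big") % (P - 3) + 2

class Party:
    def __init__(self, items):
        self.secret = secrets.randbelow(P - 3) + 2  # private blinding exponent
        self.items = items

    def blind_own(self):
        """Blind our own items with our secret exponent before sending them."""
        return [pow(hash_to_group(x), self.secret, P) for x in self.items]

    def blind_other(self, blinded_values):
        """Re-blind values received from the other party."""
        return {pow(v, self.secret, P) for v in blinded_values}

# The device holds fingerprints of its photos; the server holds the known list.
device = Party(["photo_fp_1", "photo_fp_2"])
server = Party(["photo_fp_2", "known_fp_9"])

# Because exponentiation commutes, an item present in both sets ends up with
# the same doubly blinded value, so only the size of the overlap is revealed.
device_doubly_blinded = server.blind_other(device.blind_own())
server_doubly_blinded = device.blind_other(server.blind_own())

print("shared items:", len(device_doubly_blinded & server_doubly_blinded))  # 1
```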

If the photo in question exceeds an algorithmically determined threshold of similarity with known child sexual abuse imagery, it would then be forwarded to Apple itself, where a human reviewer would examine the image and decide whether or not to report it to the National Center for Missing and Exploited Children.
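As a rough sketch of how such a threshold check might look, the snippet below compares a photo's hypothetical 64-bit perceptual hash against known hashes and escalates it for human review only above a similarity cutoff; the hash format, similarity measure, and threshold value are illustrative assumptions, not Apple's actual parameters.

```python
# Hypothetical 64-bit perceptual hashes of known child sexual abuse imagery,
# and a hypothetical similarity cutoff. None of these values are real.
KNOWN_CSAM_HASHES = {0x9F3A5C1D22E8B410, 0x0044FFEE10302211}
SIMILARITY_THRESHOLD = 0.95

def similarity(hash_a: int, hash_b: int) -> float:
    """Fraction of identical bits between two 64-bit perceptual hashes."""
    differing_bits = bin((hash_a ^ hash_b) & (2**64 - 1)).count("1")
    return (64 - differing_bits) / 64

def needs_human_review(photo_hash: int) -> bool:
    """Escalate a photo only if it is sufficiently similar to known imagery."""
    best_match = max(similarity(photo_hash, known) for known in KNOWN_CSAM_HASHES)
    return best_match >= SIMILARITY_THRESHOLD

# Example: a hash differing from a known hash in 2 of 64 bits clears the bar.
print(needs_human_review(0x9F3A5C1D22E8B413))  # True (62/64 bits match, about 0.97)
```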

According to Apple, the matching process is so accurate, and the algorithmic threshold so high, that the probability of incorrectly flagging a given account in any given year is less than "one in one trillion."

The news reports highlighted above were accurate at the time they were published, in early August 2021, but on Sept. 3, Apple said it was postponing its implementation of the project. In a statement, the company wrote:

Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Significantly, Apple indicated that it still ultimately intends to implement its plans, even though the timeline has shifted and the final details might change somewhat. As a result, the claim that Apple had announced its intention to begin scanning U.S. iPhones for child sexual abuse material remained accurate, and we are retaining the rating of "True."

Updates

[Updated] Sept. 3, 2021: On Sept. 3, 2021, Apple announced it was postponing its implementation of the child sexual abuse material scanning project, in order to "collect input and make improvements."

Dan MacGuill is a former writer for Snopes.