Survivors Laud Apple's New Tool To Spot Child Sex Abuse But The Backlash Is Growing

Aug 13, 2021

Updated August 13, 2021 at 3:31 PM ET

About a decade ago, a member of Ann's family was arrested for taking sexually abusive photos of her child and distributing them online.

"Imagine the very worst thing that has ever happened to you was recorded and then it was shared repeatedly for other peoples' pleasure," Ann told NPR. She did not want to reveal her full name to preserve her family's privacy.

As with so many crimes involving the internet, the nightmare didn't end with the arrest. Her child's real name was used with the photos that were circulating.

"Ten years later, we still have people trying to find my child, looking for images, wanting new images," she said. "It's a constant, constant battle."

Child-safety groups have for years pressured Apple, the world's largest technology maker, to help stop the spread of abusive images that are taken and shared on its devices. Now, the company is about to act.

In the coming months, Apple will roll out an update to its iOS operating system. It will contain a tool capable of scanning and identifying child pornography on iPhones and other Apple devices. The announcement, made last week, has Ann feeling encouraged.

"I can't think of a family that I know that is not a fan of companies like Apple stepping up and saying, 'Let's help stop kids from being abused,'" she said.

Apple, which has staked its reputation on the security of its devices, has lagged behind some other major tech companies in reporting such material. Last year, Facebook reported more than 20 million images of child sexual abuse on its platforms. Apple reported fewer than 300.

But the photo-scanning tool, which is one of a number of changes Apple is making to better protect children, has set off an outcry among privacy and security experts. Through open letters and newspaper op-eds, critics have argued that the technology creates a "backdoor" on Apple devices that could be used for more nefarious activities like government surveillance.

Privacy advocates fear tool could be abused

The way Apple's system will work is complicated, but it boils down to this: A database of known child abuse images maintained by the National Center for Missing & Exploited Children has been distilled into encrypted bits of code that will be stored on Apple devices.

Apple created an automated process to compare that code to photos backed up to iCloud. The company says there must be 30 matches before it notifies the nonprofit, which works with law enforcement to investigate child sexual abuse.
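To make the matching-and-threshold idea concrete, here is a minimal Python sketch of checking photos against a set of known fingerprints and flagging an account only after 30 matches. It is an illustration only, not Apple's implementation: the real system relies on perceptual hashing and cryptographic protocols so that neither the device nor Apple learns anything about non-matching photos, and the hash values, helper names and file-based setup below are hypothetical.

```python
# Simplified sketch of hash matching with a reporting threshold.
# Apple's actual system uses perceptual hashes and cryptographic matching,
# none of which is reproduced here.

import hashlib
from pathlib import Path

# Hypothetical stand-in for the codes derived from known abusive images;
# in the real system these come from the National Center for Missing &
# Exploited Children's database, not from local files.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

MATCH_THRESHOLD = 30  # Apple says 30 matches are required before any report.


def image_fingerprint(path: Path) -> str:
    """Exact-match fingerprint of a file. (A real system would use a
    perceptual hash that survives resizing and recompression.)"""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photo_paths: list[Path]) -> int:
    """Count how many photos queued for cloud backup match known fingerprints."""
    return sum(1 for p in photo_paths if image_fingerprint(p) in KNOWN_IMAGE_HASHES)


def should_flag_account(photo_paths: list[Path]) -> bool:
    """Only an account that crosses the threshold would be surfaced for review."""
    return count_matches(photo_paths) >= MATCH_THRESHOLD
```

The threshold check reflects the point above: a single match is never enough on its own to trigger a notification.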

Facebook, Google, Twitter, Reddit and other companies analyze images uploaded to their platforms for possible abuse, but Apple's tool will scan photos on personal devices, which has set off intense resistance.

Computer programmer and writer John Gruber, who runs the Apple-focused blog Daring Fireball, says he was taken aback by the public reaction.

"Part of it maybe comes down to the fact that the processing is taking place on your devices," he said. "And that may be violating a sense of personal ownership."

Apple officials on Friday told reporters that in the wake of criticism from privacy groups, the company will let human rights organizations audit how its photo-scanning system works to ensure the tool is not being misused.

"This isn't doing some analysis for, did you have a picture of your child in the bathtub? Or, for that matter, did you have a picture of some pornography of any other sort? This is literally only matching on the exact fingerprints of specific known child pornographic images," Federighi told the Wall Street Journal.

Yet more than 7,000 developers and security and privacy experts signed an online petition asking Apple to drop the plan, calling it a backdoor that threatens the privacy of all users of Apple products.

The petition said the new feature sets "a precedent where our personal devices become a radical new tool for invasive surveillance, with little oversight to prevent eventual abuse and unreasonable expansion of the scope of surveillance."

The opposition also includes the head of WhatsApp, Facebook's encrypted messaging service, as well as whistleblower Edward Snowden. Even some Apple employees have raised concerns, prompting a debate inside the company.

India McKinney of the Electronic Frontier Foundation says the technology amounts to "putting a bunch of scanners in a black box onto your phone."

She said Apple has bowed to authoritarian governments before. For example, it sells iPhones in Saudi Arabia without FaceTime because local laws prohibit encrypted calls. So the fear is that Apple will make similar concessions with its photo-scanning technology.

Apple says such a demand would be refused.

"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future," Apple said in a FAQ about the feature.

The company says its tool is built solely to detect images of child sex abuse. People could choose what amounts to an opt-out simply by not using iCloud as a backup for their photos. And the software will be introduced only in the U.S. for now.

Proliferation of child sex abuse images is 'very overwhelming'

Gruber, who studied Apple's system for scanning photos backed up in the cloud, has concluded that, if Apple sticks with the limits it has promised, it will not compromise user privacy.

"I truly believe that Apple has carved out a very carefully planned position that I think maintains the privacy that people expect from their Apple devices," he said.

Yet that has not stopped critics from questioning whether Apple's "What happens on your iPhone stays on your iPhone" billboards are still accurate.

And if the scanning technology is misused or abused, Gruber said, it could be disastrous for Apple.

Ann, meanwhile, is watching the debate and thinking of her child, who is now a young adult.

Ann said Apple's new measures won't keep those images of her child off the internet altogether. But since the images are part of the National Center for Missing & Exploited Children's photo database, Apple's system would make it much harder for people to share them.

"I know that my child's images have been identified hundreds of thousands of times, so there's quite a widespread number of them out there," she said.

Each time the National Center for Missing & Exploited Children finds an image, it notifies Ann.

And to this day, Ann said, "It can be very overwhelming."

Editor's note: Apple is among NPR's financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

AUDIE CORNISH, HOST:

This next story may not be suitable for all listeners. Apple says its next iPhone and iPad update will help catch child predators through a sophisticated photo matching system. Privacy advocates worry the system could create a backdoor on all Apple devices. NPR's Bobby Allyn takes a closer look. And we also want to note that Apple is among NPR's financial supporters. Here's Bobby.

BOBBY ALLYN, BYLINE: I recently had a conversation with Ann. We're only using Ann's middle name to preserve her family's privacy. About a decade ago, a family member was arrested for taking sexually abusive photos of her child and sharing them online.

ANN: Imagine the very worst thing that has ever happened to you was recorded and then it's shared repeatedly for other people's pleasure.

ALLYN: Their nightmare didn't end with the arrest. Her child's real name was used with the photos that were circulating.

ANN: Ten years later, we still have people trying to find my child, looking for images, wanting new images. It's, you know, a constant, constant battle.

ALLYN: Child safety groups have for years pressured Apple to help people like Ann. Finally, the company has done something. Soon, all iPhones and other Apple devices will be scanned for child pornography in an upcoming update to its iOS operating system.

ANN: I can't think of a family that I know that is not a fan of companies like Apple stepping up and saying, let's help stop kids from being abused.

ALLYN: How it will work is pretty complicated, but it boils down to this. A database of known child abuse images has been distilled down into encrypted bits of code. Apple created an automated process to compare that code to everyone's photos backed up on the cloud. When there's a match, Apple will notify the National Center for Missing and Exploited Children, which works with law enforcement. Longtime tech critic John Gruber has studied Apple's plan and says as long as Apple implements the system as it says it will, it appears secure.

JOHN GRUBER: I truly believe that Apple has carved out a very carefully planned position that I think maintains the privacy that people expect from their Apple devices.

ALLYN: Gruber admits Apple, which has staked its reputation on privacy, has a lot on the line. If the scanning technology is misused, Gruber says it could be disastrous for Apple. Already, resistance is building.

INDIA MCKINNEY: What Apple is doing is they are putting a bunch of scanners in a black box onto your phone.

ALLYN: India McKinney is with the Electronic Frontier Foundation. She is among those pushing back. Some Apple employees are, too, and a group of more than 7,000 developers and security and privacy experts have signed a petition asking Apple to drop the plan. They call it a backdoor that threatens the privacy of all users of Apple products. McKinney worries about where this can lead down the road.

MCKINNEY: How could this idea be misused by abusive governments or abusive spouses or abusive parents?

ALLYN: Critics like McKinney point out that Apple sells iPhones in Saudi Arabia without FaceTime, since local laws prohibit encrypted calls. So the fear is that Apple will make similar concessions with its photo scanning technology. Apple says such a demand would be refused. The company says the tool is built solely to detect images of child sex abuse. People have the ability to opt out by not using iCloud as a backup. Ann says she doesn't think this will keep images of her child off the internet altogether, but it will go a long way in stopping people from seeing them.

ANN: I know that my child's images have been identified hundreds of thousands of times, so there's quite a widespread number of them out there.

ALLYN: And each time one is found by child safety groups, she's notified. She says it's overwhelming.

Bobby Allyn, NPR News, San Francisco. Transcript provided by NPR, Copyright NPR.