If you were paying attention during the last lecture, then you know Apple announced two proposals to scan iPhones for photos involving sexual abuse and exploitation of children. First: before photos are uploaded to iCloud, they will be scanned on the phone to look for matches to known illegal images. Second: when an iPhone is used with a child account, photos sent or received on iMessage will be scanned for nudity.
Apple’s proposals were unexpected and they have ignited a firestorm of controversy. This is an overview of what has caused people to criticize the plans, followed by some speculation about why Apple acted now. Take your time, stretch out a little, we have a lot of ground to cover.
Apple’s plans offer valuable protection and preserve privacy
In our hair-trigger world, too many people were quick to condemn Apple without acknowledging that the company is trying to do something good and decent and valuable.
The scale of the abuse that happens to kids online, and its impact on their families, is unimaginable. Don’t research child porn – you’ll learn awful things. Just trust me: it’s much worse than you imagine. Apple is carefully and thoughtfully addressing an important issue.
Apple has provided quite a lot of detail about its plans in white papers and presentations to experts and journalists. As outlined, the plans are exactly what security experts would expect from Apple: respectful of your privacy, with elaborate safeguards against false positives and overreach, and no data exposed inappropriately to Apple employees or third parties.
There may be good reasons to criticize Apple’s plans, but start from the premise that its motives are pure.
Is it a problem that the scans occur on your phone instead of in the cloud?
Apple has been crowing for years that it is the best company for privacy. Apple hammers on the point over and over: anything that happens on your phone is private. Apple is not tracking your location, not studying your email, not making notes on your web browsing.
Some of the fiercest criticism of Apple’s child safety proposals comes from people who see a fundamental breach of that promise of privacy. The child safety scans will be done on your phone – comparing images to a database of illegal images that Apple has stored on your phone; identifying nudity in children’s text messages using your phone’s processing power.
That phone in your pocket is now a very powerful computer. For the first time phones can do this kind of processing, and they have enough storage space for a database representing millions of illegal photos.
Apple thought that doing the work on the phone would make it seem more private. Instead it infuriates some people. I’ve heard people complain that having the NCMEC hash database on the phone makes them uncomfortable because it has something to do with child porn and that’s icky. It’s just millions of lines of alphanumeric gibberish, but it makes some people feel kind of unclean to think of it on their phones.
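To make the abstraction concrete, here is a minimal sketch, in Swift, of what on-device hash matching amounts to. Everything in it is a stand-in: Apple’s real system uses a proprietary perceptual hash (NeuralHash) and a blinded matching protocol the phone itself cannot read, and neither `perceptualHash` nor `matchesKnownDatabase` below reproduces them.

```swift
import Foundation

// Hypothetical stand-in for a perceptual hash. Apple's NeuralHash is derived
// from a neural network so that near-identical images hash alike; this toy
// version just folds the raw bytes into a short opaque string so the example runs.
func perceptualHash(of imageData: Data) -> String {
    var hash: UInt64 = 0xcbf29ce484222325            // FNV-1a offset basis
    for byte in imageData {
        hash = (hash ^ UInt64(byte)) &* 0x100000001b3
    }
    return String(format: "%016llx", hash)
}

// The "database" shipped to the phone is a set of opaque strings like these,
// not images. Matching is nothing more than a set-membership check run
// locally, before a photo is uploaded.
func matchesKnownDatabase(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(perceptualHash(of: imageData))
}

// Illustrative use with made-up data.
let knownHashes: Set<String> = ["3f9a0c1d22e4b5a6", "b71e44c09d83f2aa"]
let photoBytes = Data([0x89, 0x50, 0x4e, 0x47])      // stand-in for image bytes
print(matchesKnownDatabase(photoBytes, knownHashes: knownHashes))   // false
```

The point of the sketch is only that the database is opaque data and the comparison is local; the phone never needs to store or display the underlying images.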
After all, privacy zealots say, it’s your phone, right? Critics argue that you should have an even stronger expectation of privacy on the phone than with a cloud service. Apple will be performing a law enforcement function on your phone – a radical change in Apple’s privacy posture.
Personally I don’t feel the strength of this argument. I don’t see much difference between scanning photos on your phone or scanning them when they arrive online as part of an iCloud backup. But this is the argument that resonates most with many people.
Worth keeping in mind: although Apple says the scans will run only on the phone, and only on photos that have been chosen for backup to iCloud, that is just a policy decision – nothing in the technology enforces the limit. The policy could be changed without consent, just as this child porn scanning system is being implemented without consent.
Scanning for child porn is fine, but it opens the door for Apple to be coerced into invasive behavior in the future
Putting aside the issue of where the scans occur, Apple’s detailed photo-scanning plans are swell. They’re thorough and protective and appropriate. Kudos to Apple. They’re a little late to this effort – other companies have been scanning photos and trying to control online distribution of child porn for a long time – but full credit to Apple for stepping up.
But in the end, absolutely everything comes down to these sentences in one of Apple’s white papers:
Apple will refuse all requests to add non-CSAM images to the perceptual CSAM hash database. . . . Apple will also refuse all requests to instruct human reviewers to file reports for anything other than CSAM materials for accounts that exceed the match threshold.
Apple will refuse, dammit! Apple pinky-swears that it will only scan for child porn using the methods it has outlined. That’s it! If any country dares to ask Apple to do something different, Apple will block and resist and never yield an inch.
And it probably means it. Today, in the United States, Apple is determined and steadfast and true.
But here’s the problem.
Apple has proven that it can compare photos on the phone to a database of photos that it installs with an iOS update.
China could prepare a database of photos of Uighur leaders and tell Apple that under Chinese law and as a matter of coerced economic necessity, iPhones must scan for those photos and alert the Chinese government. Would Apple resist?
You know what could be made into a convenient database in the US? Commercial movies and TV shows. The entertainment industry is obsessed with copyright violations and piracy. I fully expect them to start beating on Apple to scan phones for pirated movies, which in their mind is just as evil as child porn. Would Apple resist?
Personally, I am even more alarmed by Apple’s proposal to scan iMessage photos for nudity. To my knowledge, that is the first time Apple will scan photos to identify what the images contain.
That’s what Google Photos does so well – Google can identify a dog and a cat and your mother and the Eiffel Tower.
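For a sense of what that capability looks like in practice, here is a minimal sketch using Apple’s Vision framework, which ships a general-purpose on-device image classifier (VNClassifyImageRequest). This is not Apple’s iMessage nudity detector – that model and its thresholds are not public – and the function name and confidence cutoff below are my own illustrative choices; it only shows that a phone can label what is in a photo rather than match it against known files.

```swift
import Foundation
import Vision

// Minimal sketch: label the contents of an image on device with Vision's
// built-in classifier. The threshold here is illustrative, not anything
// Apple uses for its child-safety features.
func topLabels(forImageAt url: URL, minimumConfidence: Float = 0.5) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])
    let observations = (request.results ?? []).compactMap { $0 as? VNClassificationObservation }
    return observations
        .filter { $0.confidence >= minimumConfidence }
        .map { $0.identifier }                       // e.g. "dog", "cat", "outdoor"
}

// Illustrative use (the path is made up):
// let labels = try topLabels(forImageAt: URL(fileURLWithPath: "/tmp/photo.jpg"))
// print(labels)
```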
Apple is saying: “We can scan photos for boobies and wieners but we will absolutely not scan photos in Saudi Arabia when the government hands us a fistful of pictures of human rights activists and tells us to look for them. We will not scan photos for images that the Chinese government asks us to find, no no no, even if the Chinese courts insist that those scans are required under Chinese law. Trust us. We will refuse!”
Today, in the US, Apple means those things sincerely, I’m sure. But we live in a complicated world, and what Apple has done is prove that it has the technical ability to do far more than search for child porn.
You know how you can tell this is possible? It’s already been done. Five years ago Facebook, Twitter, Google, and Microsoft started using similar technology to identify extremist content such as terrorist recruitment videos or violent terrorist imagery. It was used to remove al-Qaeda videos.
Great! Al-Qaeda is awful and dangerous. But all those companies were sticking their toes over the line – they had technology to scan for child porn, and oh, look at the other thing it can do!
The Electronic Frontier Foundation says: “All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”
Jonathan Mayer is an assistant professor of computer science and public affairs at Princeton University who helped design a CSAM-detection system much like the one Apple is proposing. He was part of a team of researchers that recently declared the approach dangerous and warned that it should not be adopted by any company without further safeguards. From an article about the group’s findings:
Most alarmingly, the researchers noted that it could easily be co-opted by a government or other powerful entity, which might repurpose its surveillance tech to look for other kinds of content. “Our system could easily be repurposed for surveillance and censorship,” write Mayer and his research partner, Anunay Kulshrestha, in an op-ed in the Washington Post. “The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.”
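Their point is easy to see in code. Here is a minimal, entirely hypothetical sketch of the kind of matcher they describe: nothing in it knows or cares what the hashes represent, so swapping the data shipped to the device changes what gets flagged without changing a line of logic.

```swift
import Foundation

// A content matcher of this shape is agnostic to what its database "means".
// The label below exists only for humans; the matching logic never sees it.
struct ContentMatcher {
    let databaseLabel: String        // e.g. "CSAM" today, anything tomorrow
    let knownHashes: Set<String>     // opaque hashes shipped to the device

    func flags(_ photoHash: String) -> Bool {
        knownHashes.contains(photoHash)     // identical check, whatever the label
    }
}

// Same code path, different payload: only the data shipped to the phone
// decides what kind of image gets reported.
let csamMatcher   = ContentMatcher(databaseLabel: "CSAM",
                                   knownHashes: ["a1b2c3d4e5f60718"])
let censorMatcher = ContentMatcher(databaseLabel: "politically banned imagery",
                                   knownHashes: ["9f8e7d6c5b4a3921"])
print(csamMatcher.flags("9f8e7d6c5b4a3921"))     // false
print(censorMatcher.flags("9f8e7d6c5b4a3921"))   // true
```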
Can we trust Apple to stand firm and never cross that line when it’s under pressure from countries around the world?
Why did Apple act now?
Apple has been taking strong positions on privacy. Five years ago Apple refused to unlock the phone of the San Bernardino shooter because it just wouldn’t be able to sleep at night if it violated its rules about privacy. Apple wrote an open letter to customers about the San Bernardino shooter’s phone and said:
For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.
But privacy drives some groups crazy.
Child-safety advocates have been fiercely critical of Apple for not reporting the presence of illegal photos on iPhones – because Apple wasn’t checking. The National Center for Missing and Exploited Children reported that Apple filed 265 reports with the authorities last year, compared with Facebook’s 20.3 million. Sounds terrible, doesn’t it? That’s what privacy looks like.
Apple has been pressured at Congressional hearings about this issue, and members of Congress have been threatening regulation to require it to start scanning for child porn. There are similar laws or regulations scheduled to take effect or under consideration in the United Kingdom and European Union.
One thing companies do when lawmakers threaten to impose legal obligations they don’t want is to take on some lighter-lift measures that their lobbyists can point to while trying to water down the legislation. Maybe Apple is doing something reasonable today to head off unreasonable laws later.
Although it’s difficult to defend an absolute stance on privacy, Apple is finding out it’s even harder to defend a nuanced one – and perhaps harder still to resist expansions of the technology going forward.
“We’re only going to look for child porn!”
“Well, we’re only going to scan for child porn and terrorists, but that’s all.”
“Okay, we’re going to scan everyone’s photos and messages for child porn, and terrorists, and LGBTQ images in Saudi Arabia, and the Spanish Inquisition, but that’s where we draw the line.”
Nobody expects the Spanish Inquisition. And nobody expects Apple to cave in to every demand for invasions of our privacy.
But Apple has opened a back door for law enforcement on your phone, and even a thoroughly documented, carefully thought-out, and narrowly scoped back door is still a back door.