Apple's Scanning Your Photos (Here's What That Means)
If you’re up on tech news, you’ve probably heard about some changes Apple’s making in the next major version of their iPhone and iPad operating systems. In iOS 15 you can stream movies and TV over a FaceTime call, Memoji got some new looks, and, oh yeah, minor detail, Apple will be checking every photo you upload to iCloud to make sure it’s not child pornography.
That’s right. Apple is introducing three powerful new features to cut down on the proliferation of Child Sexual Abuse Material, or CSAM. First, anyone who uses Siri or Search to find CSAM-related topics will instead be sent links for help dealing with their creepy issues. Pretty mild, seems like a good idea. Second, children on family accounts won’t be able to see sexually explicit images in the Messages app anymore unless they’re willing to send a notification to their parents that they’re viewing those kinds of images. It’s been a minute since I was in my teens, but I think it’s safe to say most kids will not be opening those texts. And finally, every image uploaded to iCloud will be checked against a big list of known CSAM and flagged if it looks like a match. If a user racks up a lot of flagged images, Apple will be able to review those flagged photos, shut down their iCloud account, and alert the authorities.
Although these all seem like good ideas, the last one has made a lot of news recently and is kicking up quite a controversy in the tech space. No one wants to see CSAM spread online, but privacy advocates and cryptographers are warning that Apple is opening a Pandora’s box by snooping through encrypted files. Child protection watchdogs are praising Apple’s moves against internet predators and saying the privacy claims are overblown hand-wringing in the face of a crisis. Everyone else is lining up on sides, choosing between cautious nerds and righteous parents. It’s starting to feel like no one even noticed the whole “streaming TV over FaceTime” feature! Come on! They worked hard on that!
Reading through the coverage of the CSAM controversy, it’s hard to tell what’s hyperbole and what’s real. On the one hand, Apple did receive an official warning from Congress that they needed to do more to prevent their services from being used for CSAM, and it’s clear that they were doing almost nothing before. On the other hand, despite making a lot more noise about being the “Privacy” company, Apple has ceded more and more ground to governments over the last couple of years at the expense of their users’ privacy. Who’s right? Am I helping pedophiles by keeping my dinner photos personal, or is this whole feature just a slippery slope toward helping surveillance states? Can I trust Apple not to overstep their role as child protectors, or am I naively handing over any semblance of privacy for a PR stunt that won’t change anything? Should I be worried about the new Memoji, because their hair options are getting out of control, or will it be cool to see myself with a blue mohawk??
These are hard questions to answer, and if you really wanted to know what’s up you’d need a person to read through long technical briefs and understand hard topics like elliptic curve cryptography for you. You’d need someone who’s worked with Apple before because a lot of their lingo is … uh, let’s be polite and say unique. And on top of all that you’d want someone devastatingly handsome who has a true way with words. Luckily for you, that someone is me! (Link not included to my devastating handsomeness because I need you to focus on my words and not my body.)
I’m going to give you a better idea of what Apple can and can’t do with this new tech without trying to tell you to be for or against it. By the end of this you’ll understand better what’s going on, be able to spot people who are just tossing around buzzwords, and, most important of all, you will not have to squint at a single cryptographic flow diagram.
What exactly is Apple up to?
If you read much about this topic and you’re not a complete geek like me, you’ve probably been convinced that Apple is scanning all of your photos, that they have a backdoor into your phone to read all of your texts, and that they’re going to start calling the cops on you because you texted your dealer a pretty good 420 joke last April. Smartphones as we know them are over. It’s time to burn it all down and start over with primitive things, like the wheel and Nokia.
Well, I’m here to tell you that you can calm down about an Apple employee looking at your spicy texts and hidden photos: it’s not going to happen. The way Apple has set up their CSAM tech on your phone is sophisticated and smart, and it will not let their employees see your dank memes.
The recent controversy is mostly around the CSAM detection that could send people’s info to law enforcement, so let’s focus on that. When you hear that pictures on your phone will be reviewed to see if they’re CSAM, you might naturally believe that means that Apple can see all of the pictures on your phone. But actually they can’t!
When you upload a photo to iCloud, your phone will also send a doubly encrypted copy of the photo to Apple. In addition to this doubly encrypted copy of your photo, it also uploads a voucher. This might sound complicated, but you can understand the double encryption and the voucher as long as you have a working knowledge of Harry Potter and space stuff.
Let’s talk about the doubly encrypted copy of your photo that Apple receives. The two layers of encryption are different, so we’re going to treat them differently. The first layer is like your own personal Voldemort: between Apple and every user’s photos stands that user’s Voldemort. Apple can’t get to your photo because, and I hate to sound like a broken record here, your Voldemort is in the way. Each user has their own Voldemort, and if Apple defeats a user’s Voldemort on one picture, he’s gone on all of their other photos as well.
The second layer of encryption is an airlock behind Voldemort. I’ll explain the airlock before the voucher because it’s a bit easier. Basically, if your phone finds an image’s fingerprint on its List of Terrible Pictures, then Apple gets a security badge that can open that image’s airlock. That means Apple can only open the second layer of encryption on images that match known CSAM.
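If you’d rather see the airlock idea as code than as wizardry, here’s a minimal sketch in Python. It’s a toy stand-in, not Apple’s actual scheme (which uses fancier cryptography, a private set intersection protocol): the point is simply that the key to an image’s inner layer can only be built from fingerprints already on the list, so everything else stays sealed. Every name below is made up for illustration.

```python
# Toy model of the "airlock" (second layer), NOT Apple's actual scheme:
# the inner key is derived from the image's fingerprint, and the server
# only knows how to build keys for fingerprints on its blocklist.
# Requires the third-party `cryptography` package. All names are made up.
import base64
import hashlib

from cryptography.fernet import Fernet, InvalidToken

LIST_OF_TERRIBLE_PICTURES = {"fingerprint-of-known-bad-image-1",
                             "fingerprint-of-known-bad-image-2"}

def inner_key(fingerprint: str) -> bytes:
    # Toy key derivation: hash the fingerprint into a valid Fernet key.
    digest = hashlib.sha256(fingerprint.encode()).digest()
    return base64.urlsafe_b64encode(digest)

def phone_seals_payload(fingerprint: str, payload: bytes) -> bytes:
    # The phone always seals the payload, whether or not it matches anything.
    return Fernet(inner_key(fingerprint)).encrypt(payload)

def server_tries_to_open(sealed: bytes) -> bytes | None:
    # The server only holds "security badges" (keys) for blocklisted
    # fingerprints, so everything else stays locked.
    for known_bad in LIST_OF_TERRIBLE_PICTURES:
        try:
            return Fernet(inner_key(known_bad)).decrypt(sealed)
        except InvalidToken:
            continue
    return None

# A dinner photo stays sealed; a blocklisted image opens.
assert server_tries_to_open(phone_seals_payload("dinner-photo", b"pad thai")) is None
assert server_tries_to_open(
    phone_seals_payload("fingerprint-of-known-bad-image-1", b"payload")
) == b"payload"
```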
The voucher is essentially like an object in the Harry Potter universe. Like a shoe, or a scarf, or the burner phone Argus Filch has to use. If your phone checks a picture and finds it on the List of Terrible Pictures, then it turns that picture’s voucher into a Horcrux. So, if you’re uploading known child abuse material, the voucher will help Apple get past the first layer of encryption, Voldemort. If you’re uploading any other kind of picture (like one of dinner, or of the vibrator you’re not telling your husband about), then the voucher won’t help Apple get past the first layer of encryption at all.
Harry Potter fans may now be wondering how many Horcruxes each Voldemort has, and the insane answer is thirty. Can you imagine how many more books we’d have if Voldemort had 30 Horcruxes?? The seventh book would have been 60 movies long. In the real world that means you’d have to upload 30 images from the List of Terrible Pictures to iCloud before Apple can even get past the first layer of encryption.
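For the curious, the 30-Horcrux rule is a real cryptographic technique called threshold secret sharing. Here’s a minimal sketch using Shamir’s classic scheme; it shows the general idea only, with invented numbers, and it is not Apple’s actual code or parameters.

```python
# A minimal sketch of threshold secret sharing (the "30 Horcruxes"),
# using Shamir's scheme over a prime field. This shows the general
# technique only; Apple's actual construction and parameters differ.
import random

PRIME = 2**127 - 1   # a large prime; all arithmetic happens mod PRIME
THRESHOLD = 30       # shares needed before the outer key can be rebuilt

def make_shares(secret: int, n_shares: int) -> list[tuple[int, int]]:
    # A random polynomial of degree THRESHOLD - 1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(THRESHOLD - 1)]
    def poly(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    # Lagrange interpolation at x = 0; impossible with fewer than THRESHOLD shares.
    assert len(shares) >= THRESHOLD, "not enough shares to rebuild the key"
    secret = 0
    for i, (xi, yi) in enumerate(shares[:THRESHOLD]):
        num = den = 1
        for j, (xj, _) in enumerate(shares[:THRESHOLD]):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

outer_key = random.randrange(PRIME)            # the per-user "Voldemort" key
shares = make_shares(outer_key, n_shares=50)   # one share per matching image
assert reconstruct(shares[:THRESHOLD]) == outer_key   # 30 matches: key recovered
```

The nice mathematical property here is that with 29 shares the interpolation could land on literally any value in the field, which is why Apple can’t get even partway past the first layer until the 30th match shows up.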
Once Voldemort is defeated and the airlock is open, Apple can actually look at the image that you uploaded. That means that, for Apple to see any image at all, that user has to have uploaded at least 30 pieces of known CSAM, and the image Apple wants to look at has to itself be known CSAM.
Some readers may be wondering why the second layer of encryption is there at all. If you’ve already found 30 matches, you can be pretty sure this person is hoarding CSAM, right? The second layer is there to keep Apple from viewing any non-offending photos, even ones belonging to someone as disgusting as a child pornographer. It also heads off the concern that Apple might bend to law enforcement and end up with a backdoor into the entire photo library of anyone who uploaded 30 pieces of CSAM. Overall, it’s a system where Apple is going well out of its way to preserve as much of its users’ privacy as it can.
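Put the two layers together and the logic on Apple’s side boils down to something like this sketch. There’s no real cryptography in it; it just shows when Apple gets to see anything at all, and every name is invented.

```python
# Toy model of when Apple can see anything at all, given the two layers above.
# Each voucher records whether the image matched the list, plus a sealed payload
# that (per the "airlock") only opens for matching images. No real crypto here.
THRESHOLD = 30

def review_account(vouchers: list[tuple[bool, bytes]]) -> list[bytes]:
    matched_payloads = [payload for matched, payload in vouchers if matched]
    if len(matched_payloads) < THRESHOLD:
        # Too few matches: the outer key can't be rebuilt, so Apple sees nothing,
        # not even the matching images.
        return []
    # Threshold met: the outer key is recoverable, but the inner layer still
    # limits Apple to the matching images only.
    return matched_payloads

# 29 matches and 1,000 dinner photos: Apple sees nothing.
assert review_account([(True, b"match")] * 29 + [(False, b"dinner")] * 1000) == []
```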
Why is this a Big Deal?
Finding and reporting CSAM is nothing new, sad as that is to say. All of the major tech companies are doing it: they have tools and teams devoted to reviewing posts on their sites and reporting the ones that violate US CSAM laws. In fact, all the big tech companies report millions and millions of these every year ... that is, all the big tech companies except Apple. Apple reported around 250 CSAM violations in 2019. Not 250 thousand. 250 total. And that’s a problem, because the scale of Apple’s services means there are a lot of other CSAM violations happening on their watch; they just aren’t checking for them.
If Apple were Facebook or Twitter, this would not be hard to solve. They could set up the same tools those companies have, get rid of the creeps, and get back to announcing new camera specs. But Apple uses something called end-to-end encryption, and that makes this all a bit trickier. You’ve probably heard that term before, but without getting too into the weeds, it basically means that everything on everyone’s iPhone is completely private, and even Apple can’t see what’s on there. Kinda like how the people who make bike locks don’t know your combination, Apple can’t decrypt your phone even if they want to. But that puts them in a bind when it comes to figuring out who has CSAM and who doesn’t. How can you police something you can’t see?
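If you want the bike-lock analogy in code, here’s a tiny sketch assuming a simple symmetric scheme (Fernet, from the third-party cryptography package) as a stand-in for Apple’s real protocols: the key never leaves the phone, so the server is stuck holding ciphertext it can’t scan.

```python
# Why end-to-end encryption leaves the server blind: the key stays on the
# device, and the server only ever stores ciphertext. A toy stand-in for
# Apple's real protocols, using the third-party `cryptography` package.
from cryptography.fernet import Fernet

phone_key = Fernet.generate_key()                      # never leaves the phone
ciphertext = Fernet(phone_key).encrypt(b"photo bytes")

# This is all the server gets. Without phone_key there is nothing to scan.
print(ciphertext[:16], "...")
```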
The solution Apple came up with is very different from what the other tech companies do. Facebook or Twitter can scan a photo the moment you upload it to their servers, but for Apple that might already be too late: the photo could already be encrypted. So instead of checking on their servers, Apple has to install something on your phone to check for CSAM there. Basically, other sites have a bouncer at their entrance, but Apple has a spy in your house.
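Here’s the bouncer-versus-spy difference as a rough sketch. An ordinary SHA-256 hash stands in for a perceptual hash like Apple’s NeuralHash, and the sketch skips the blinding that, in Apple’s real design, keeps the phone from even learning whether a photo matched; the names are invented.

```python
# Bouncer vs. spy: the same check, run in different places. SHA-256 stands in
# for a perceptual hash like NeuralHash, and this toy version skips the
# blinding that keeps the real phone from learning the result.
import hashlib

LIST_OF_TERRIBLE_PICTURES = {"fingerprint-of-known-bad-image"}

def fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

def bouncer_at_the_door(uploaded_plaintext: bytes) -> bool:
    # Facebook/Twitter style: the server sees the photo and checks it itself.
    return fingerprint(uploaded_plaintext) in LIST_OF_TERRIBLE_PICTURES

def spy_in_the_house(image_bytes: bytes) -> bool:
    # Apple style: the phone runs the check locally, before the photo is
    # encrypted and uploaded, and bakes the result into the voucher.
    return fingerprint(image_bytes) in LIST_OF_TERRIBLE_PICTURES
```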
For privacy experts, this is a very slippery slope. For years now Apple has declined to help law enforcement break into criminals’ phones, and they had an airtight reason not to: the phones are fully encrypted, so Apple simply can’t get in. Now, privacy experts warn, there will be more and more pressure on Apple to build surveillance tools into their OS to help catch people doing bad things … or whatever a local government thinks of as “bad things.” After all, Apple controls the List of Terrible Pictures that every uploaded photo is checked against. Right now it’s there to stop child predators, but couldn’t the Chinese government ask Apple to add images of Tank Man and end up with a list of everyone sharing that banned image?
Whether you think it’s a step too far or not far enough is up to you, but now you should have a better understanding of what Apple’s new child safety features are all about. To round up some misconceptions flying around the internet, here’s a rapid FAQ about the most common worries.
Can Apple See My Photos Now?
No. Even if you’re the scum of the earth and upload tons of CSAM to iCloud, Apple can only see the uploaded images that match its list of known CSAM. Every other photo stays locked.
Does Apple Have A Backdoor?
No. A backdoor means a way to bypass encryption completely. Although even experts throw the term around here, Apple’s system introduces less of a Backdoor and more of an Inside Man: a system on your phone that uses information from Apple to alert Apple about encrypted data they otherwise couldn’t see.
Why Do Privacy Folks Care? Doesn’t This Just Help Catch Predators?
The privacy issue isn’t with catching people who have CSAM, and it isn’t that privacy advocates think you should never give up a little privacy to catch predators. The issue is that Apple alone controls the List of Terrible Pictures. Right now Apple says the list contains only known CSAM and comes from a trusted source. But experts point out that that guarantee isn’t baked into the technology, and the list could change at any moment.
In fact, researchers happened to be working on almost exactly this technology just before Apple announced it, and they warn that this method of detection is scary precisely because you can look for whatever you want. Such a system, they warned, “could be easily repurposed for surveillance and censorship. The design wasn’t restricted to a specific category of content.” So the whole thing asks you to put a lot of faith in Apple never using this power for anything but CSAM.