Somewhere in the world — perhaps locked in a vault at Apple headquarters in Cupertino, California — Apple has stored a set of very large, very secret numbers that cryptographers call private keys. These keys give Apple — and only Apple — the ability to update the software on the hundreds of millions of iPhones around the world.
These private keys give Apple a tremendous amount of power. Its iPhones regularly contact Apple's servers for software updates, and users have no realistic way of checking what these software updates contain. That means Apple can reprogram the world's iPhones at any time to do almost anything the company wants.
This is why the FBI's request for Apple to unlock San Bernardino terrorism suspect Syed Farook's iPhone has drawn so much attention. Hardly anyone objects to the FBI having access to Farook's private information. But the power conferred by Apple's private keys could let the company do a lot more than unlock the phones of dead terrorism suspects.
For example, suppose the FBI was trying to gather evidence to prosecute a suspected drug kingpin. He's savvy enough not to say anything incriminating in telephone calls or emails. But the FBI knows he carries an iPhone in his pocket. In principle, Apple could help out the FBI by crafting a malicious software update for the suspect's iPhone — and only the suspect's iPhone — that secretly turned on its microphone, recorded every conversation he had, and transmitted the recordings back to FBI headquarters.
This kind of assistance would be so useful to law enforcement that it's only a matter of time before governments start pressuring technology companies to provide it. We can expect technology companies and civil liberties groups to strenuously object to this kind of request, of course, but it's far from obvious that they'll win the legal and political fight.
And this may be why Apple has chosen to take such a hard line in the dispute over Farook's iPhone. Apple knows that in the future it will face many requests to aid law enforcement by modifying software on its customers' devices. The company believes that complying with these requests would betray its customers' trust and damage its reputation. So it is rallying customers to its side, hoping that having a big, public fight over the issue will put the FBI on the defensive.
Phone companies have long helped the FBI spy on their customers
The idea of forcing Silicon Valley companies to help the government spy on their customers might seem outlandish, but this kind of thing has become commonplace in the telephone industry. Indeed, the FBI's legal argument against Apple in the San Bernardino case rests heavily on a 1977 Supreme Court decision ordering the New York Telephone Company to help the FBI spy on one of its customers.
The FBI was trying to install a pen register — a device to monitor the numbers dialed from a particular telephone — on two telephone lines at an address that the government believed was being used as an illegal gambling establishment. The FBI had obtained a warrant to install the pen register, but it found that it couldn't do it unobtrusively without access to the phone company's facilities — and the phone company refused to provide access.
So the FBI invoked a 1789 law called the All Writs Act to force the phone company to comply. The case went all the way to the Supreme Court, which ultimately sided with the government, ruling that the phone company had an obligation to help the FBI execute the warrant and spy on its customer.
In 1994, Congress went a step further and passed the Communications Assistance for Law Enforcement Act, which requires telephone companies to design their networks with government surveillance in mind. Thanks to this law, the FBI now has (in the words of a 2007 Wired article) "a sophisticated, point-and-click surveillance system that performs instant wiretaps on almost any communications device."
CALEA was limited to publicly regulated telecommunications providers. Apple has argued that Congress's decision to apply the law only to those companies signals that Congress did not intend for non-utility companies like Apple to be forced to provide the same kind of technical assistance that phone companies must.
The FBI naturally disagrees. It points out that the Supreme Court's 1977 ruling predates CALEA. The FBI says the same arguments that compelled the New York Telephone Company to help the FBI spy on phone customers should also compel Apple to help the FBI unlock its customers' phones.
Automatic software updates are ruining the dreams of privacy hawks
Many civil libertarians are uncomfortable with the whole idea of forcing private companies to help the government spy on their customers. For the past two decades, privacy advocates have hoped that cryptography would provide a technological solution to the political problem of excessive government surveillance. Modern cryptographic algorithms are widely considered unbreakable, and privacy advocates hoped that ubiquitous cryptography would force the government to stop spying on people.
And for people who are technologically sophisticated and privacy-conscious, that future is here. National Security Agency whistleblower Edward Snowden, for example, used encrypted email to communicate with journalists Laura Poitras and Glenn Greenwald, and as far as we know not even the NSA was able to unscramble his messages.
But truly unbreakable cryptography demands a lot of its users. Cryptographic software is only as secure as the computer it runs on, so users must carefully choose the software they install to ensure it hasn't been compromised.
Unfortunately, the vast majority of users don't have the knowledge or patience to do this. That's why iPhones — like many other modern devices — are hard-coded to trust software updates that are signed by Apple and refuse to run other software updates. As long as Apple is trustworthy, this makes our iPhones more secure, because it makes it harder for hackers to install malware on our iPhones. But it also means iPhone users are completely at Apple's mercy.
San Bernardino is the start of a much bigger fight
Journalists (including me) have written that the FBI is asking Apple to hack into Farook's iPhone. But in a sense, that's not accurate. Hackers exploit flaws in software to gain unauthorized access. But Apple's ability to update software on an iPhone isn't a security flaw — it's just the way the iPhone was designed to work.
And more and more devices are being built to work the same way. I recently got an Amazon Echo — a voice-activated speaker that can do everything from play music to order me a refill of paper towels. Amazon has the ability to automatically push software updates out to the device. If the FBI wanted to spy on my family, it could ask Amazon to modify the Echo's software to record every conversation in our home and send audio files to the J. Edgar Hoover Building.
The same is likely to be true of many video game consoles, internet-connected home security cameras, and baby monitors.
It's only a matter of time before law enforcement agencies start to take advantage of the opportunities created by all these internet-connected devices. And if the FBI wins its legal fight with Apple, it could establish a precedent that opens the door to the use of the All Writs Act for other types of surveillance.
But even if Apple wins in court — or the courts issue a narrow ruling that precludes more ambitious surveillance efforts — this debate isn't going to go away. Law enforcement agencies are accustomed to being able to use every available means to spy on suspects. There's no reason to think they'll be happy with a broad and permanent exception for digital devices.