The FBI just unlocked Syed Farook's iPhone without Apple's help. Here's why that matters.

After weeks of the FBI insisting that only Apple could enable it to access encrypted data on the iPhone of San Bernardino terrorism suspect Syed Farook, the law enforcement agency says it doesn't need Apple's help after all.

We don't know how the FBI managed to break the device's encryption after weeks of insisting that it could do so only with Apple's help. The government has said it received assistance from a third party, but it has refused to identify that organization or the techniques that were used.

A couple of weeks ago, it looked like we were headed toward a court ruling that could help define privacy rights for the 21st century. But the government's abrupt change of heart means that courts won't have a chance to rule on whether technology companies can be compelled to help the government break the encryption on their customers' devices. The issue could crop up again in the coming months, as law enforcement agencies seek Apple's help to unlock the iPhones of suspects in other cases.

"From the beginning, we objected to the FBI's demand that Apple build a backdoor into the iPhone because we believed it was wrong and would set a dangerous precedent," Apple said in a Monday night statement. "This case should never have been brought."

The FBI wanted Apple to help guess Farook's passcode

Apple CEO Tim Cook refused to help the FBI hack Farook's iPhone. (Photo by Harry How/Getty Images)

Syed Farook and his wife killed 14 people in a shooting spree in San Bernardino, California, in December 2015 before being killed in a shootout with authorities. The government recovered Farook's smartphone, an iPhone 5C. But Farook had enabled the device's encryption technology, preventing the FBI from accessing its contents without knowing the device's four- or six-digit PIN. So the FBI went to court and demanded that Apple help the government unscramble the device's contents.

The encryption chip on the iPhone uses a powerful algorithm called AES to protect customer data. Each iPhone has a unique number called an encryption key that is needed to scramble or unscramble the data on the iPhone. This key is 256 bits long — that is, a string of 256 1s and 0s — which means there are roughly 10^77 possible values for an iPhone's encryption key.

Apple doesn't keep copies of these encryption keys. And if you wanted to crack the iPhone's encryption by "brute force" — guessing until you find the right one — it would take many lifetimes, even if every computer on the planet were working on it.
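The scale is easy to check with a back-of-the-envelope calculation. The Python sketch below assumes a deliberately absurd guess rate (a billion machines each testing a trillion keys per second), purely as an upper bound for illustration, not a real benchmark:

# Rough estimate of a brute-force search over all 256-bit AES keys.
# The guess rate is a deliberately generous assumption, not a benchmark.
KEY_SPACE = 2 ** 256                  # possible 256-bit keys (~1.16e77)
GUESSES_PER_SECOND = 10**9 * 10**12   # a billion machines, a trillion guesses/sec each

seconds = KEY_SPACE / GUESSES_PER_SECOND
years = seconds / (60 * 60 * 24 * 365)
print(f"about {years:.1e} years to try every key")   # about 3.7e+48 years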

So then why are we having a debate at all? The reason is that the weakest link in the iPhone's security isn't the encryption itself but rather the passcode people use to unlock their iPhones. The encryption chip on the iPhone refuses to function until the correct passcode is entered. And by default, the passcode on an iPhone is only four or six digits long.

So the passcode to unlock the encryption chip only has 10,000 or 1 million possible values on most iPhones (if you're extra paranoid, you can enable alphanumeric passcodes, which have many more possible values). If you're trying to crack iPhone encryption, it's a lot faster to try to guess the user's passcode than the underlying encryption key.
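A quick comparison of the two search spaces makes the point (plain Python, using only the digit counts described above):

# Compare the search spaces an attacker faces.
key_space = 2 ** 256      # possible 256-bit encryption keys
pin4_space = 10 ** 4      # four-digit numeric passcodes
pin6_space = 10 ** 6      # six-digit numeric passcodes

print(f"4-digit PINs: {pin4_space:,}")      # 10,000
print(f"6-digit PINs: {pin6_space:,}")      # 1,000,000
print(f"256-bit keys: {key_space:.1e}")     # 1.2e+77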

People have built robots to use "brute force" on smartphone passcodes — punching in each of the possible values one at a time until they find the correct one. It doesn't take that long for a robot like this to try every possible passcode.

Apple has added two features to the iPhone to help prevent this kind of hack. First, if the user guesses the passcode wrong several times, the iPhone will introduce a delay of up to an hour before it will accept additional guesses. Second, the user can optionally enable a self-destruct feature that, after 10 bad guesses, will delete information needed to unscramble the encrypted data.
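Apple hasn't published the code that enforces these rules, but the behavior can be modeled in a few lines. The delay schedule and the check_passcode and wipe_device helpers below are illustrative stand-ins, not Apple's actual implementation:

import time

# Illustrative model of the iPhone's guess limiting, not Apple's real code.
# Delays (in seconds) after repeated wrong guesses; the real schedule
# escalates to roughly an hour.
DELAY_AFTER_FAILURES = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 3600}
MAX_FAILURES = 10    # with the self-destruct option on, the 10th failure wipes the phone

def try_unlock(passcode, check_passcode, wipe_device, failures=0):
    """One unlock attempt, with escalating delays and optional self-destruct."""
    time.sleep(DELAY_AFTER_FAILURES.get(failures, 0))
    if check_passcode(passcode):
        return True, 0                 # unlocked; the failure counter resets
    failures += 1
    if failures >= MAX_FAILURES:
        wipe_device()                  # deletes the keys needed to decrypt the data
    return False, failures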

And this is where the FBI sought Apple's help. The FBI wasn't asking Apple to directly unscramble the data on the iPhone — something Apple couldn't have done if it wanted to. Rather, it demanded that Apple disable delays between passcode guesses, disable the self-destruct feature, and allow passcodes to be entered electronically over a Wi-Fi network or through the iPhone's Lightning port.

Taken together, these measures would have allowed the FBI to guess Farook's passcode much more quickly — and without worrying about triggering the phone's auto-wipe function.
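With those protections gone, the remaining bottleneck is how long the phone's hardware takes to check each guess, which Apple's security documentation has put at roughly 80 milliseconds. Treating that figure as an assumption, exhausting the passcode space becomes a matter of minutes or hours:

# Worst-case brute-force time once the delays and self-destruct are disabled.
# The 80 ms per guess is an assumption based on Apple's published estimate
# of the hardware's key-derivation time.
SECONDS_PER_GUESS = 0.080

for digits in (4, 6):
    worst_case = 10 ** digits * SECONDS_PER_GUESS
    print(f"{digits}-digit PIN: at most {worst_case / 3600:.1f} hours")
# 4-digit PIN: at most 0.2 hours
# 6-digit PIN: at most 22.2 hours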

The FBI figured out how to access the iPhone without Apple's help

A couple of weeks ago, it seemed like we were headed for a legal showdown that could help define Americans' privacy rights in the 21st century. Apple argued that complying with the FBI's order would set a bad precedent that would undermine every smartphone user's privacy rights. The FBI countered that Apple's stance was hampering efforts to fight terrorism and other crimes. A magistrate judge initially ordered Apple to comply, but the company was challenging that order.

Then last week, on the eve of a pivotal hearing in the case, the FBI asked for the proceedings to be halted, saying it had learned of a new technique that might allow it to gain access. A week later, the FBI told the court that it had recovered the data on Farook's iPhone and no longer needed Apple's help.

The FBI has provided few details about how it accessed Farook's iPhone — saying only that it did so with the aid of an unnamed third party. An Israeli newspaper reported that the FBI was working with an Israeli company called Cellebrite to hack the iPhone, but anonymous law enforcement officials told USA Today that these reports were inaccurate.

Experts have discussed a number of possible strategies for accessing the device. For example, one theoretically possible but probably impractical attack on the iPhone's hardware, known as decapping, involves performing "microscopic surgery on the wiring of silicon chips" to extract the secret encryption key buried in the device.

Another possibility, known as a replay attack, could defeat the device's limit on passcode guessing by making a copy of the device's memory, making a few guesses (which increment the iPhone's guess counter), and then rolling the memory back to its earlier state, resetting the counter before it can trigger a lockout. However, when FBI Director James Comey was asked about this method, he said it doesn't work.
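In outline, such an attack would look something like the sketch below. The snapshot_flash, restore_flash, and try_passcode helpers are hypothetical stand-ins for hardware-level operations on the phone's flash memory; this is a conceptual model of the technique, not working exploit code:

# Conceptual sketch of a replay (NAND-mirroring) attack. The helper functions
# are hypothetical stand-ins for hardware operations, not a real API.
def replay_attack(snapshot_flash, restore_flash, try_passcode, batch_size=9):
    """Guess passcodes in small batches, rolling the memory back before
    the guess counter can trigger the self-destruct feature."""
    baseline = snapshot_flash()              # memory image with the counter at zero
    for candidate in range(10 ** 6):         # every six-digit passcode
        if try_passcode(f"{candidate:06d}"):
            return candidate                 # found it
        if (candidate + 1) % batch_size == 0:
            restore_flash(baseline)          # reset the counter before it hits 10
    return None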

The most popular theory is that the FBI accessed Farook's iPhone by exploiting a previously unknown security vulnerability in the iPhone's software. These vulnerabilities are a valuable commodity in the hacking and intelligence worlds — the National Security Agency is believed to stockpile them to use against high-value targets. So the FBI could have received assistance from the NSA, from private security firms, or from an independent security researcher who had discovered a new technique.

The FBI's changing stance could hurt it in future cases

FBI Director James Comey has called on Apple to help law enforcement break into locked iPhones. (Drew Angerer/Getty Images)

The FBI had argued that only Apple could help it unlock Farook's iPhone. That was important because the law the FBI invoked, called the All Writs Act, requires the FBI to make reasonable self-help efforts before forcing a company like Apple to help its investigation. For weeks, the FBI had claimed that it had exhausted other methods and was turning to Apple as a last resort.

Perhaps the FBI was telling the truth and really didn't know of any alternatives at the time. Regardless, Apple is sure to point out the FBI's shifting stance next time the government tries to force Apple to help hack a customer's iPhone.

And Apple's win in this case is sure to stiffen the spine of Apple CEO Tim Cook. The San Bernardino terrorism case was in many ways a PR nightmare for Apple, forcing the company to effectively defend the privacy rights of one of the most notorious mass murderers in recent years. By standing up to the FBI in this case, and emerging unscathed, Apple will be stronger in future confrontations.

We don't know if the feds will try to force Apple to hack its users again

The big question is what happens to the many other iPhones that various law enforcement officials would like Apple to help unlock. Manhattan District Attorney Cyrus Vance has said he has 175 iPhones he would like Apple to help him crack — and there are presumably hundreds of others across the country.

It's not clear if Vance will be able to get help from the same third party that helped the FBI — or if this assistance will be limited to a handful of high-value targets like the San Bernardino shooter. It's possible that Apple will soon wind up fighting the same legal battle over a different iPhone.

Apple would like to prevent this by building an iPhone whose security features are so powerful that even Apple can't help break them. But it might prove impossible to build a smartphone capable of withstanding attacks from a sophisticated adversary such as the FBI.

The fact that most smartphones are secured by a short numeric passcode instead of a long alphanumeric password places an inherent limit on how secure smartphones can be — hackers merely need to find a way to bypass passcode-guessing limits in order to "brute force" these devices.

Moreover, the complexity of modern smartphone software makes it extremely difficult to eliminate the kind of security vulnerabilities that may have allowed the FBI to hack into Farook's iPhone. There's a thriving underground market for security flaws, and a sufficiently determined and sophisticated adversary might always manage to find a way to break into encrypted devices.

iPhone encryption still matters — even if the FBI can sometimes defeat it

The fact that the FBI managed to defeat the encryption on Farook's iPhone might seem like bad news for privacy-conscious iPhone users. But there's also good news here for privacy advocates.

The FBI's experience suggests that a few sophisticated organizations — like the FBI and the NSA — are able to break into the iPhones of high-value targets. However, iPhone encryption may still provide robust protection against local law enforcement, private criminals, and many foreign governments. And the fact that it's possible, if difficult, to break iPhone encryption may make the courts less willing to intervene and force Apple to help break into encrypted devices.

Most smartphones today have encryption enabled by default — if you have to enter a four- or six-digit PIN to unlock your iPhone or Android device, then there's a good chance your device's contents are scrambled. If you want to give yourself added protection against the FBI, you can enable alphanumeric passcodes, which are far more difficult to break.
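For a sense of how much harder an alphanumeric passcode is to guess, here's the same back-of-the-envelope comparison, assuming each character is drawn from the 62 letters and digits:

# Search-space comparison: numeric PINs vs. alphanumeric passcodes.
ALPHABET = 26 + 26 + 10     # lowercase, uppercase, digits

for label, space in [
    ("6-digit PIN", 10 ** 6),
    ("8-character alphanumeric", ALPHABET ** 8),
    ("10-character alphanumeric", ALPHABET ** 10),
]:
    print(f"{label}: {space:.1e} possibilities")
# 6-digit PIN: 1.0e+06 possibilities
# 8-character alphanumeric: 2.2e+14 possibilities
# 10-character alphanumeric: 8.4e+17 possibilities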