The Line in the Sand: iOS Encryption

By now, anyone who cares to read this article probably knows the background, but here's the short-short version:

Apple has been compelled by a court order to comply with an FBI demand to circumvent security on the iPhone of one of the San Bernardino shooters. Specifically, the FBI wants Apple to create a custom version of iOS that bypasses data protections, which could be loaded onto the phone to break the passcode and/or encryption. Apple has refused, and in an open letter, Apple CEO Tim Cook has explained why complying with this order would be disastrous for computer security and have far-reaching repercussions.

Let's quickly go over some iOS security nitty-gritty. iOS includes a feature called data protection, which encrypts all data stored on your device. iPhones and iPads have a dedicated crypto engine sitting between the flash storage and the rest of the hardware, meaning all stored data can be encrypted and decrypted quickly and securely. Additionally, the decryption keys are derived from the device's unique identifier (UID), a set of shared secret keys, and the device passcode, meaning that stored data can only be accessed on the device that stored it, with the valid passcode. Another feature allows iOS to completely erase the secret keys if the passcode is entered incorrectly ten times in a row, rendering the phone's data irretrievable. All this means that the filesystem is nearly impossible to brute-force decrypt. This is, of course, by design.
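To make the mechanics concrete, here's a minimal, purely illustrative sketch in Python. Everything in it (the ToyDataProtection class, the in-memory stand-in for the UID, the choice of KDF and iteration count) is invented for illustration; Apple's real implementation lives in dedicated hardware and differs in many details. But the logic is the same: the key is derived from both the passcode and a device-bound secret, each guess is deliberately expensive, and ten failures destroy the key material.

```python
import hashlib
import secrets

class ToyDataProtection:
    """A toy model of passcode-entangled device encryption. Illustrative only."""

    MAX_ATTEMPTS = 10

    def __init__(self, passcode: str):
        # Stand-in for the hardware UID key: on a real iPhone this is fused
        # into the crypto engine at the factory and is never readable by
        # software -- it can only be *used* by the hardware.
        self._device_uid = secrets.token_bytes(32)
        self._failed = 0
        self._key = self._derive_key(passcode)  # protects the filesystem

    def _derive_key(self, passcode: str) -> bytes:
        # Entangle the passcode with the device-unique secret via a
        # deliberately slow KDF. Because _device_uid never leaves the
        # device, every guess must run on this device, at this speed.
        return hashlib.pbkdf2_hmac(
            "sha256", passcode.encode("utf-8"), self._device_uid, 1_000_000
        )

    def unlock(self, passcode: str) -> bytes:
        if self._key is None:
            raise RuntimeError("keys erased; data is permanently unrecoverable")
        if secrets.compare_digest(self._derive_key(passcode), self._key):
            self._failed = 0
            return self._key  # caller can now decrypt the filesystem
        self._failed += 1
        if self._failed >= self.MAX_ATTEMPTS:
            self._key = None  # the "erase data" option: game over
        raise ValueError("wrong passcode")

# A few wrong guesses, each burning one of the ten attempts:
phone = ToyDataProtection(passcode="1337")
for guess in ("0000", "0001", "1234"):
    try:
        phone.unlock(guess)
    except ValueError:
        pass
```

Note the consequence for brute force: even a weak 4-digit passcode, with only 10,000 possibilities, is safe against an attacker who gets at most ten on-device guesses.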

It's been difficult to parse exactly what kinds of things the FBI is asking for, but Tim Cook's open letter cites at least one specific demand:
... the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone's physical possession.
Tim Cook is right to sound the alarm. But right off the bat, with this letter freshly making the rounds on the internet, I see arguments springing up riddled with misconceptions. I really feel the need to address these individually, so here goes...

This isn't About Privacy, it's About Security Integrity
In part, yes, this is a privacy issue, but that isn't what everyone should be talking about. Rather, it's an important security discussion. The FBI is asking Apple to find a way around its own security apparatus, in this case using a custom-loaded OS as the attack vector.

The problem is that in software development, when you discover a security flaw, you fix it, and you notify the users of that software that it's insecure and that they should update to the patched version. People seem to take it as a given that Apple can decrypt data on the phones that they manufacture, but they can't. Again, that's by design.

If the US government forced Apple to circumvent the encryption on iOS, the technique would probably become common knowledge among Apple's development teams. Attackers on the outside would know that there was a security hole waiting to be exploited, or software at Apple that could undo encryption. Suddenly, there would be a huge incentive to obtain this knowledge. Apple employees, current or former, could be bribed or coerced into providing information. Someone could try to break into Apple's network to steal the software. Or, most likely, an army of developers at some Russian mafia think-tank could simply beat their heads against the problem until they find a way to duplicate the exploit.

In any case, once the knowledge of how to break an iPhone's encryption shows up in the wild, what does Apple do? Do they (as any sane developer would) patch the exploit so that it can't be used for crime, or do they leave it open because they're obligated to by the FBI? This is the real endgame question, and another example showing there's really no difference between a "backdoor for the good guys" and a "major security vulnerability".

This is Yet More Science Denialism
As I explained in my recent article, We Are the Nerds: and You Need to Listen to Us, the current "debate" on encryption reeks of science denialism and anti-intellectualism. At the end of that article I listed nearly every major computing-security authority in existence and showed how they were all vocally opposed to breaking encryption. Admittedly, computer science is what many call a "softer science", since it's composed of a lot of things that humans made up. However, encryption delves deep into mathematics and logic, areas where things are either possible or not possible. So, when such an overwhelming majority of experts call breaking encryption a bad, if not dangerous, idea, should we not take that advice to heart?

It seems to me that most, if not all, of the people who have framed the US government's requests as "reasonable" lack the computing knowledge necessary to assess whether what they're asking for is, in fact, reasonable.

The Slippery Slope is Realistic
When encryption advocates say that this case sets a dangerous precedent, it's not a fallacy. The core concept of encryption is currently under attack from multiple governments, on multiple fronts (the UK's Investigatory Powers Bill being one of the worst examples). It's not unreasonable to assume that, as soon as one company or software project caves to government demands, governments will be emboldened to make more. The absolute worst-case scenario is a major government demanding a backdoor to TLS/SSL, the cornerstone technology upon which HTTPS, secure e-mail, and a number of other secure connection technologies are based. If governments demanded a backdoor to TLS/SSL, and that backdoor was discovered by criminals or terrorists, the internet as we know it would collapse. Large networks like Facebook, Google, and Apple's iCloud, as well as financial institutions and government agencies, depend on transport layer security to operate over wide area networks without hackers and criminals constantly tampering with their data.

It's Impossible to Stop Terrorists from Using Encryption
One of the most damning arguments against government-mandated backdoors is that they simply won't do what they're designed to do. Multiple sources have indicated that terrorists are already moving away from encryption solutions designed in the US to ones made by companies and individuals far outside the purview of the US government. Even if every government in the West were to ban unbreakable encryption (i.e., encryption that actually works), they still couldn't stop terrorists from using unbreakable apps made in Russia, China, South America, or elsewhere.

Meaning: the West will have compromised its entire technology-security apparatus and gained nothing.

In Conclusion: There is No Debate
Like climate change or vaccines, there's nothing more to discuss here. It is an argument between those who understand the technologies and logistics of computer security, and those who don't understand it and would destroy it.

So, what happens next? Apple is defying a court order from a US federal magistrate. It is a gutsy, dangerous move. But they're doing it because they also know everything I've explained above. The outcome will depend on whether regular citizens rally to the side of technology experts, or to their increasingly authoritarian governments.

I hope, for all our sakes, it's the former.


Update 1: I removed some language in the security-not-privacy section which, on review, was not technically correct. I feel the point of the section stands without it but I may revise later.

Update 2: Gizmodo has a great article on this story by Kate Knibbs.


"USA Smash iPhone!" illustration © Jesse Schooff/GeekMan.ca