I did say that this blog would avoid getting into political issues and stick to practical concerns. But the events of the past week with Apple and the FBI are pretty disturbing, and I want to talk about why.
First, with one exception, none of the technical matters involved in this conversation is anything that you or I or any other private individual can do anything about. It’s all taking place in the rarefied air of law enforcement versus public policy, among those who believe they know what is good for us and who have the power to change how others are allowed to access our personal data. We can lobby our elected officials and hope somebody can get past the fear mongering enough to listen.
Next, there is the one technical thing you can do to protect your personal device: choose a strong passcode. I’m going to assume you already use a passcode, but the default four- or six-digit number isn’t going to stand up to a brute-force attempt to break it. Make it longer. Make it numbers and letters if you can stand to (it’s a real pain to enter, I know). Do it not because you “have something to hide” but because you will always have something you don’t want shared, and it’s not possible to know in advance what, when, or how that might come about. Make protecting your personal data a normal activity. The longer and more complicated your passcode, the more effort it takes to guess. As long as we continue to use passcodes this will be true, and the goalposts are always moving.
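To put rough numbers on that, here is a back-of-the-envelope sketch. The 80 ms per guess is an assumption drawn from Apple’s published passcode key-derivation timing; treat the exact figure, and these passcode categories, as illustrative:

```python
# Worst-case brute-force times for device passcodes.
# ASSUMPTION: ~80 ms per guess, per Apple's published key-derivation
# timing for on-device attempts; the exact figure is illustrative.

SECONDS_PER_GUESS = 0.08
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for label, alphabet, length in [
    ("4-digit PIN", 10, 4),
    ("6-digit PIN", 10, 6),
    ("8 chars, letters+digits", 36, 8),
]:
    seconds = (alphabet ** length) * SECONDS_PER_GUESS
    if seconds < SECONDS_PER_YEAR:
        print(f"{label}: about {seconds / 3600:,.1f} hours")
    else:
        print(f"{label}: about {seconds / SECONDS_PER_YEAR:,.0f} years")
```

Even at one guess every 80 milliseconds, a four-digit PIN falls in well under an hour and a six-digit PIN in about a day, while adding letters pushes the worst case into millennia. That difference, not whether you “have something to hide,” is what a longer passcode buys you.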
Now, on with the real subject of this post. Get comfortable, this will take a while.
Folks who have followed this issue know that Apple (along with other companies) has routinely responded to search warrants and other official requests for customer data. From a practical standpoint, they have to. But Apple has also been redesigning parts of its systems to make such recovery less feasible. (It’s important to note that recovering data from a locked device is not the same as unlocking it.) Not only is it now more difficult for people outside Apple to access private data on iOS devices, it’s also more difficult for Apple itself to do so.
Discussion of the two current court cases, with detail on what is possible to recover from a locked device for various iOS versions
“No, Apple Has Not Unlocked 70 iPhones For Law Enforcement”
Court order requiring Apple to comply
The reason for this has many parts, and one very important part is of course to make their product more attractive to customers. Apple is in the business of selling equipment; that’s what they do. When it came out that we tinfoil hats hadn’t just been making up the stuff we suspected the NSA was snooping on (the reality far exceeded our speculations), US companies suddenly had a huge problem: international customers. Foreign organizations, businesses and governments alike, were none too keen to have confirmed in excruciating detail the extent to which the US government was spying on everyone. If US companies want to continue to sell products, they have to be able to convince security-conscious customers that they aren’t just a lapdog for the NSA.
When somebody says “Apple is only doing this because of marketing” consider what that means. People don’t buy your product without “marketing.” Unless you have somehow managed a sweet exclusive deal that can never be taken away, your company depends on marketing for its continued existence. And your marketing and the products it promotes have to appeal to customers. All over the world, more and more people are saying “You know, I don’t much like the idea that the US Government could walk in and see my stuff.”
Strong cryptography is not just for protecting business interests. Livelihoods, and sometimes lives, also depend on the ability to keep private things private. For years people have claimed that products built in China are untrustworthy because the Chinese government can force their makers to provide a way into protected data. Better, then, to buy from trusted companies in enlightened countries where that won’t happen. Who is left on that list?
And what about terrorism? Of course, the things terrorists have done are awful. Nobody is contesting that. But opening up everyone to risk so governments have the ability to sneak up and overhear a potential terror plot doesn’t change how threats are discovered. The intelligence agencies already have more data than they can handle; the process is what’s broken, not a sudden lack of things to look at. There have been multiple cases where pronouncements of “this would never have happened without encryption” were quickly followed by the discovery that the perpetrators were using basic, non-encrypted communications that were not intercepted or correctly analyzed. “Collect more data because we can” is not a rational proposal for improving the intelligence process, even if the abuse of privacy could be constitutionally justified.
There is no such thing as a magic key that only authorized users are permitted to use while all others are kept out forever. If there’s a way in, someone will find it. Nothing is perfect: a defect will eventually be found, or perhaps those authorized users will slip up and leave the door open. Nor are state actors trustworthy when they say these powers will only be used to fight the most egregious terror threats and everybody else will be left alone. Even if they could prevent backdoors from being used without authorization, their own histories belie their claims.
The dangers of having “secret” entry enforced only by policy to not give out the key
“TSA Doesn’t Care That Its Luggage Locks Have Been Hacked”
Intelligence agencies claim encryption is the reason they can’t identify terror plots, when the far larger problem is that mass surveillance generates vast quantities of data they don’t have the ability to use effectively
“5 Myths Regarding the Paris Terror Attacks”
Officials investigating the San Bernardino attack report the terrorists used encrypted communication, but the Senate briefing said they didn’t
“Clueless Press Being Played To Suggest Encryption Played A Role In San Bernardino Attacks”
What the expanded “Sneak-and-Peek” secret investigatory powers of the Patriot Act, claimed to be necessary because of terrorism, are actually being used for
“Surprise! Controversial Patriot Act power now overwhelmingly used in drug investigations”
TSA ordered searches of cars valet-parked at airports
“TSA Is Making Airport Valets Search Your Trunk”
What is being asked of Apple in this case?
Not to unlock the phone; everyone agrees that’s not technically possible. Not to provide data recoverable from the locked device by Apple in its own labs, which it could do for previous iOS versions but cannot now. What the court order actually demands is a special version of the operating system that prevents data from being wiped after ten incorrect passcodes, provides a means to try new passcodes rapidly in an automated fashion, and can be installed on the target device (which will only accept OS updates via the Apple-authorized standard mechanism).
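That parenthetical is the crux of why only Apple can be ordered to do this: the device refuses any system software that doesn’t carry a valid Apple signature. As a minimal sketch of that gatekeeping idea (this is not Apple’s actual update protocol; the Ed25519 key choice, function names, and flow here are illustrative assumptions), a device would accept an image only if it verifies against the vendor’s embedded public key:

```python
# Sketch of signed-update gatekeeping: the device holds the vendor's
# public key and refuses any system image whose signature fails.
# Illustrative only; real devices use a more elaborate, hardware-rooted
# chain of trust. Requires the third-party `cryptography` package.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

def accept_update(image: bytes, signature: bytes,
                  vendor_key: Ed25519PublicKey) -> bool:
    """Install-time gate: accept only vendor-signed system images."""
    try:
        vendor_key.verify(signature, image)  # raises on any mismatch
        return True
    except InvalidSignature:
        return False

# Demo: a freshly generated keypair stands in for the vendor's real key.
vendor_private = Ed25519PrivateKey.generate()
image = b"system image bytes"
good_sig = vendor_private.sign(image)

assert accept_update(image, good_sig, vendor_private.public_key())
assert not accept_update(image + b"tampered", good_sig,
                         vendor_private.public_key())
```

Note what this design implies: no outside party can load the forensic OS, but whoever controls the signing key controls what every such device will accept. That is why the fight centers on Apple, and on Apple alone.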
What would happen if Apple did this?
The government says Apple would be shown to be a fine, upstanding corporate citizen, this one single solitary special case would be “solved,” and we would all go on with our lives content in the knowledge that justice was served. Apple can even delete the software it created when it is done. The FBI claims the contents of this employer-owned phone are required to know whether the terrorists were communicating with other terrorists in coordinated actions. No other evidence has suggested this happened, so it must be hidden on that particular phone (and not, for example, on the non-work phones that were destroyed, or in any of the data on Apple’s servers that Apple did provide).
How the law enforcement community is reacting to the prospect of the FBI winning this case
“FBI Says Apple Court Order Is Narrow, But Other Law Enforcers Hungry to Exploit It”
Apple would, first and foremost, be compelled to spend considerable effort creating a tool to be used by the government. Not just “we’ll hack at it and see what we find,” but a testable piece of software that can stand up to verification at a level sufficient to survive court challenges to its accuracy and reliability. Because if the FBI did find evidence it wanted to use to accuse someone else, that party’s legal team would absolutely question how the evidence was acquired. If that can’t be done, all this effort is wasted.
A discussion of the many, many requirements for building and maintaining a tool suitable for use as a source of evidence in criminal proceedings.
“Apple, FBI, and the Burden of Forensic Methodology”
Next, the probability of this software escaping the confines of Apple’s labs is high. The third-party testing necessary to make the results admissible in court gives, at absolute minimum, anyone in physical possession of a test device the opportunity to reverse-engineer its contents. And since the FBI holds the target device, it too can hand the device to its own security researchers to evaluate. Many people will need access to the software during the course of the investigation.
Finally, everyone in the world would know that Apple, which said it had no way to do this thing, now does. And once it does, more people will want it. Other governments would love to have this capability, and Apple, as a global company, will be pressured to give in. What would that pressure look like? In the US, it’s standard for the government to threaten crushing fines or imprisonment of corporate officers for defying the courts. Governments can forbid Apple to sell products in their countries or assess punitive import taxes. Any of these can destroy a company.
Non-technical people often decry security folks as eggheads crying wolf over petty concerns when there are so many more important things to discuss. That’s fine, and our role as professionals includes the responsibility to educate and explain how what we do impacts others.
I encourage you to consider this issue for yourself, and what it would mean if you were the person at the other end of that search warrant. Ubiquitous communication and data collection have fundamentally changed how much of the world lives and works, and there are plenty of governments far less principled than our own that want access to the private data of mobile phone users. We should not be encouraging them by saying it’s just fine, no problem, go ahead, just hand over the (cryptographic) keys and everything will be OK.