A US Federal Court has ordered Apple to help decrypt the iPhone of one of the San Bernardino shooters. Tim Cook has pushed back and vowed to fight the order. One suspects this case will end up at the Supreme Court and could produce one of the most significant decisions of the digital age.
What we have discovered here is that the FBI is unable to break into even the least sophisticated of Apple's phones, an iPhone 5c, without the cooperation of its user. The phone was not actually owned by the shooter but by San Bernardino County. That makes this a less messy case, because the phone's owner has given permission to unlock it.
As near as I can tell, there are two issues at stake. First, whether Apple can actually, technically, comply with the Court order. Second, whether the order is a good thing.
On the first issue, it seems that it can. This post by Dan Guido walks through the technical details (backed up by this post). It is complicated, but it is possible for Apple to load new firmware onto the phone that would allow the FBI to unlock it by brute force, guessing PINs until one works. Without the new firmware that is not possible, because the phone will be wiped after too many incorrect attempts. (This is why I suspect the FBI cannot rely on cooperation from the user, who could easily have provided the PIN.)
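To see why the auto-wipe feature is what blocks the FBI, here is a minimal Python sketch. It is purely illustrative: the specific PIN, the 10-attempt limit, and the function names are assumptions made up for the example, loosely mirroring iOS's optional erase-after-10-failures setting.

```python
# Illustrative sketch: brute-forcing a 4-digit PIN is trivial without
# a wipe limit, and hopeless with one. SECRET_PIN and the 10-attempt
# limit are hypothetical values chosen for this example.

SECRET_PIN = "7294"   # hypothetical PIN, unknown to the attacker
ATTEMPT_LIMIT = 10    # device wipes after this many wrong guesses

def brute_force(limit=None):
    """Try every 4-digit PIN in order; return (found_pin, attempts_used)."""
    for attempts, guess in enumerate((f"{n:04d}" for n in range(10000)), start=1):
        if limit is not None and attempts > limit:
            return None, limit          # device wiped before we got here
        if guess == SECRET_PIN:
            return guess, attempts      # PIN recovered
    return None, 10000

# Without the limit, 10,000 guesses exhaust the whole space.
found, tries = brute_force()
print(found, tries)                     # finds "7294" on attempt 7295

# With the wipe limit, the attacker covers only 10 of 10,000
# possibilities (0.1%) before the data is destroyed.
found, tries = brute_force(limit=ATTEMPT_LIMIT)
print(found, tries)                     # (None, 10): PIN not found
```

The firmware the Court is asking for would, in effect, remove the `limit` branch, reducing the problem to the first, trivial case.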
But there is an important wrinkle. Apple can only do this for older phones; newer phones with Touch ID are more secure. As it happens, that does not apply to this particular phone, which is an important detail in this affair. Of course, it does apply to pretty much all non-Apple phones, so this fight is about more than just Apple.
Which brings us to the second issue: is it a good idea? Apple argues, first, that the knowledge that a phone can be unlocked will demonstrate that it is possible and so compromise security:
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.
We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.
There is something to this, but isn't it the case that nothing new is known here? The order itself, and the fact that outside experts can work out what Apple would have to do, already reveal the vulnerability. That Apple's engineers actually end up doing it doesn't change much in terms of people's trust and security. Suffice it to say, I was surprised that the FBI couldn't do it itself. Either Apple can do it or it can't; being forced to do it doesn't change that.
But Apple makes a second argument.
The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
Here they are saying that this is a slippery slope: force Apple to do something to unlock one phone, and it is a few short steps to an order requiring them to have a backdoor. If a backdoor is required, it becomes common knowledge that such access exists, and the value of encryption melts away.
The game theory of this is important. Common knowledge requires that we all know, and know at higher orders (I know that you know that I know, and so on), that a phone cannot be decrypted. Relax that a little and much can unravel. Even if exploiting it is hard, the mere knowledge that a backdoor exists is enough to ruin confidence. And with the phone becoming our life and identity, ruining that confidence would have grave unintended consequences. As usual, it won't harm the criminals but will harm the rest of us.
It is clear from this why Apple is making a stand, and I suspect that all like-minded tech companies should join them. Otherwise, it is unclear how much point there is to investing in secure devices.
Interestingly, the possibility that only the iPhone 5c can be unlocked with Apple's help, while later phones and devices cannot, poses its own issue. A Court could require unlocking for any device where it is technically possible, but hold that where a device cannot be unlocked, Apple and others have a right to produce a secure device. If that is a constitutional right, it opens up the market for innovation in this area, even though it ensures that governments will not be able to access devices in the future. In the short run, Apple would have a huge strategic advantage here and would likely bring hardware-level security to all of its devices.
[Update: as is often the case, more information has come to light on what is technically feasible. Ben Thompson noted that it may be possible to use the unlocking method for the iPhone 5c on devices with Touch ID and a secure enclave; apparently this information is coming from Apple. This eliminates some of the nuance discussed above, in particular about what is, or is likely to become, common knowledge.
The general conclusion stands on what a government-required backdoor would mean for innovation in device security: it could well kill that market.]