A backdoor for “us” is a backdoor for “them”

Last March, China’s National People’s Congress drafted an anti-terrorism bill that would require companies to implement encryption backdoors. A spokeswoman from the Chinese Foreign Ministry insisted that the bill was “a requirement for the government in combating terrorism.” Sound familiar?

For those not in the loop, Apple has entered a legal battle with the FBI over decrypting an iPhone. On December 2, 2015, Syed Farook and Tashfeen Malik killed 14 people in a shooting at the Inland Regional Center in San Bernardino, CA. Farook owned an iPhone 5C, which the FBI recovered. The phone’s storage is encrypted, and the FBI believes it may contain information about the shooting.

The key used for encryption on the iPhone 5C is relatively weak: it is derived from a unique hardware ID combined with a four-digit PIN. Since the hardware ID is fixed for a given device, an attacker only has to guess the PIN, and there are only 10,000 possible four-digit PINs, easy to brute-force even with a slow algorithm.
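To make the size of that search space concrete, here is a minimal sketch of the enumeration. The `try_pin` callback is a hypothetical stand-in for the device’s passcode check (in reality the check must run on the device itself, since the key is entangled with the hardware ID):

```python
# Illustrative only: enumerate the full search space of four-digit PINs.
# try_pin is a hypothetical stand-in for the device's passcode check.
def brute_force(try_pin):
    for candidate in (f"{n:04d}" for n in range(10_000)):
        if try_pin(candidate):
            return candidate
    return None

# Example against a made-up "secret" PIN:
found = brute_force(lambda pin: pin == "7291")  # finds "7291"
```

Ten thousand candidates is nothing for a computer; absent other defenses, the loop above finishes in well under a second.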

While it would normally be trivial to try 10,000 PINs, iOS has a few security features that make it more difficult. Rate-limiting forces a delay between tries. Every incorrect guess imposes a longer wait before you can try another PIN.
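A rough estimate shows how effective those delays are. The schedule below is hypothetical (illustrative numbers, not Apple’s actual timings), but it captures the shape of the defense: a handful of free attempts, then escalating forced waits:

```python
# Hypothetical escalating-delay schedule: minutes of forced wait
# imposed after the Nth consecutive failure. Illustrative values,
# not Apple's actual timings.
def delay_minutes(failures):
    if failures < 4:
        return 0
    if failures == 4:
        return 1
    if failures == 5:
        return 5
    if failures in (6, 7):
        return 15
    return 60  # every failure from here on

# Worst case: 9,999 failures before the final guess succeeds.
total = sum(delay_minutes(n) for n in range(9_999))
print(f"~{total / 60 / 24:.0f} days of forced delays")  # ~416 days
```

Under even this modest schedule, exhausting the PIN space takes over a year instead of under a second.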

iOS can be configured to wipe its data after 10 incorrect passcode attempts. Guessing randomly, that leaves a 0.1% chance of unlocking the phone before it wipes its storage. Farook had this option enabled on his iPhone.
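The 0.1% figure is just the ten allowed guesses divided by the 10,000 equally likely PINs:

```python
# Ten guesses against 10,000 equally likely four-digit PINs.
attempts_before_wipe = 10
possible_pins = 10 ** 4
p = attempts_before_wipe / possible_pins
print(f"{p:.1%}")  # prints "0.1%"
```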

The United States District Court for the Central District of California has ordered Apple to assist the FBI in accessing the contents of Farook’s iPhone. While Apple has been helping with this investigation (as it has with previous ones), the court is now ordering them to produce a custom, insecure version of iOS with rate limiting and data wiping removed. The order is specific enough to suggest that what it asks for is technically possible.

Others with more legal expertise have written about that side of the case. Apple’s lawyers are challenging the court’s questionable interpretation of the All Writs Act, the current version of which was passed in 1911.

I want to look at the precedent this case would set, in the US and abroad. The FBI says it only wants the software for this one phone. But with that precedent established, they could request the same software from Apple for many more phones. They could also request similar services from Google, Microsoft, BlackBerry, and the developers of other operating systems.

Programming, signing, and deploying a new version of iOS is no doubt a complicated task, consuming many engineering hours. To handle multiple requests, Apple would need to keep the weakened code around for reuse, rather than tailor it to individual phones. A security breach at Apple, at a law enforcement agency, or anywhere else in the chain could put this software in the hands of criminals or foreign governments.

This brings us back to China. If Apple created a backdoored version of iOS, what other countries would have access to it? Google, Yahoo, and many other American technology companies have given up information on political dissidents to the Chinese government in order to do business in the country. In the past several years, Apple has seen massive growth in China. Would they give the software developed for the FBI to the Chinese government?

President Obama criticized the NPC’s mandatory backdoor proposal last year. “This is something that I’ve raised directly with President Xi,” Obama said. “We have made it very clear to them that this is something they are going to have to change if they are to do business with the United States.” The Chinese government ultimately removed the mandatory backdoor requirements from the bill.

There are techniques other than a backdoor that could retrieve the data from Farook’s phone without weakening every Apple user’s security. Fortunately for China, if Apple loses its appeal, the US seems more than happy to provide a backdoor for them.