Apple Can Access the Terrorist’s Cell Phone Without Creating a Backdoor
Professor Neuman is the director of the Center for Computer Systems Security at the University of Southern California. He submitted this editorial.
Apple has been asked to assist the government in obtaining access to data on the phone of Syed Farook, one of the San Bernardino shooters. Apple has declined to provide assistance, claiming that complying with the request would put the privacy of all of its iPhone users at risk.
That is not the case, and Apple is actually arguing a different issue in the court of public opinion. Let me address the broader debate first, then I will explain why the matter of Farook’s phone is different.
There has been legislation proposed in New York State and in California that would require cellphones sold in those states to be “capable of being decrypted and unlocked by its manufacturer or its operating system provider.” Law enforcement agencies have been pushing for similar legislation at the national level.
As a technologist, I can tell you that such legislation is misguided. Backdoors added to such devices decrease the security of data on the device, not just against the government, but against criminals and insiders. These kinds of backdoors are intentionally inserted vulnerabilities, and they make the security mechanisms on the phone more complex and more vulnerable to compromise. Further, as more individuals at Apple or in government obtain authority to use the backdoor, there will inevitably be cases where that authority is abused for political reasons or personal gain. Foreign governments will similarly demand the capability to use the backdoor.
The matter of Syed Farook’s phone is different, however. In 2014 Apple told its customers, “Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data.” While Apple likely believed this statement when it made it, it made an error in implementing the claim. It is possible to load onto the phone a modified operating system that bypasses some security measures, enabling a “brute force” attack against the passcode without concern that ten wrong guesses will erase the data. Installing such modified software requires a digital signature that only Apple can produce, using signing keys it keeps secret.
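To make the mechanics concrete, here is a toy sketch in Python (my own illustration, not Apple’s code, with an invented passcode). It shows why the erase-after-ten-failures setting defeats brute forcing, and why a build with that check removed makes a short numeric passcode trivial to recover by exhaustive search.

```python
# Toy illustration (not Apple's implementation) of brute-forcing a
# 4-digit passcode, with and without an auto-erase limit.

MAX_ATTEMPTS = 10  # the iOS "Erase Data" setting wipes after 10 failures

def try_passcodes(check, auto_erase=True):
    """Try every 4-digit passcode against a check() oracle."""
    failures = 0
    for guess in range(10000):
        code = f"{guess:04d}"
        if check(code):
            return code
        failures += 1
        if auto_erase and failures >= MAX_ATTEMPTS:
            return None  # device wiped: the data is gone forever
    return None

secret = "7394"  # hypothetical passcode for the demonstration
check = lambda code: code == secret

print(try_passcodes(check, auto_erase=True))   # None: wiped after 10 tries
print(try_passcodes(check, auto_erase=False))  # "7394": found by exhaustion
```

With the erase check active, the attacker gets at most ten guesses out of ten thousand possibilities; with it disabled, every passcode can be tried.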
Apple has argued that complying would put its other customers’ privacy at risk, since the modified software would inevitably get out and be used on other phones. On this point I disagree with Apple. The government has asked that the software be written so that it would work only on Syed Farook’s phone, and if it were subsequently modified to work on other phones, the new version would require a new digital signature from Apple before it could be loaded. Despite Apple’s arguments, this case is not about the creation of a backdoor; the backdoor already exists on the subject phone as a vulnerability in Apple’s implementation. Apple is being asked to assist in the use of that existing backdoor.
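The device-binding argument can be sketched in a few lines. This is my own simplified model, not Apple’s actual scheme: real code signing uses asymmetric cryptography, whereas I use an HMAC as a stand-in for a signature, and the device ID string is invented. The point is that the signed image embeds the target device’s unique identifier, so the signature validates only on that one phone, and any tampering with the image invalidates it.

```python
# Simplified model of device-bound code signing. HMAC stands in for a
# real asymmetric signature; key and device IDs are hypothetical.
import hashlib
import hmac

SIGNING_KEY = b"vendor-private-key-stand-in"  # held only by the vendor

def sign_image(image: bytes, device_id: str) -> bytes:
    """Vendor signs the software image bound to one device's unique ID."""
    return hmac.new(SIGNING_KEY, image + device_id.encode(), hashlib.sha256).digest()

def boot_accepts(image: bytes, sig: bytes, this_device: str) -> bool:
    """The boot loader recomputes the signature using its own device ID."""
    expected = hmac.new(SIGNING_KEY, image + this_device.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)

image = b"modified OS without the erase-after-ten check"
sig = sign_image(image, "TARGET-DEVICE-ID")

print(boot_accepts(image, sig, "TARGET-DEVICE-ID"))           # True: loads here
print(boot_accepts(image, sig, "SOME-OTHER-DEVICE-ID"))       # False: rejected
print(boot_accepts(image + b"tweak", sig, "TARGET-DEVICE-ID"))  # False: tampered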
In my view, given the nature of the case, Apple has an ethical obligation to comply with the request, since Syed Farook’s Fourth Amendment rights are not at issue, and complying would not create a new backdoor affecting the privacy of Apple’s other customers.
While I believe that Apple should comply, I am not inclined to argue that it should be compelled to comply. At issue on that question is the level of assistance that a company can be compelled to provide. The order was issued on the basis of the All Writs Act, and Apple argues that the order infringes on its First and Fifth Amendment rights. Case law has found the writing of software to be a form of written expression, so forcing Apple to rewrite software would constitute compelled speech. Further, Apple argues that under its Fifth Amendment rights it should not be conscripted to undermine the security mechanisms of its own products.
Regarding the level of effort needed for Apple to comply with the request: Apple already creates new versions of iOS on a regular basis. Two of the requested changes, disabling the ten-tries-and-erase functionality and eliminating any added delay between passcode guesses, would require little more effort than “commenting out” lines in existing code. The request to create a new interface for submitting passcode guesses (over USB or Wi-Fi) would require more extensive changes.
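As a rough illustration of why the first two changes are cheap, consider what a passcode failure handler has to decide. This is hypothetical pseudocode of my own, not iOS source; the delay schedule is invented. Disabling both protections amounts to short-circuiting the existing checks.

```python
# Hypothetical failure handler (not iOS source code; delays invented).
ERASE_AFTER = 10
DELAYS = {5: 60, 7: 300, 9: 3600}  # escalating lockouts, in seconds

def handle_failed_attempt(failures: int, forensic_build: bool = False):
    """Return 'erase', a lockout delay in seconds, or 0 (no penalty)."""
    if forensic_build:
        return 0  # both protections "commented out" in the special build
    if failures >= ERASE_AFTER:
        return "erase"
    return DELAYS.get(failures, 0)

print(handle_failed_attempt(5))                        # 60: one-minute lockout
print(handle_failed_attempt(10))                       # "erase"
print(handle_failed_attempt(10, forensic_build=True))  # 0: no penalty
```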
I agree with Apple’s argument that compelling it to comply creates a dangerous precedent. I am not concerned with the dozen or so other requests that Apple cites to demonstrate the floodgates such a precedent would open. I agree that such a precedent could compel Apple to comply in cases where there is less public support for such an order. My primary concern is that such a precedent could be used to compel a company to add new backdoors to new products, or to software releases that could then be pushed out “over the air” to other customers. That would achieve, through legal precedent, the mandated backdoors I argued against above. If Apple were to comply with the request on ethical grounds, no precedent would be set.
Apple wants to claim the moral high ground for taking a stand to protect the privacy of its customers. Apple should be asked how strong that stand is for the privacy of its customers elsewhere. It should be asked whether it has complied, or plans to comply, with similar requests from foreign governments, and it needs to tell us whether it has intentionally included, or is considering including, backdoors in the phones it sells elsewhere.
Of greater concern to me is that foreign intelligence organizations could target Apple to steal the signing keys needed to compromise the security of its phones. This would give those foreign intelligence services access to data on our phones that not even our own law enforcement can access. Parts of the iPhone are manufactured in China. Apple should be asked what measures it has taken to ensure that hardware backdoors have not been inserted into these components that make their way into our phones. It is possible that such backdoors could be included even without Apple’s knowledge. This provides an even stronger argument against providing backdoors in new devices.
In summary, I believe Apple has an ethical obligation to provide the assistance requested with respect to Syed Farook’s phone, but that Apple should not be compelled to provide such assistance. There is a danger that by taking its stand in this matter, Apple risks losing public support on the issue of backdoors in new devices. I am strongly opposed to such requirements and believe such legislation would be dangerous. As you read this, Apple is making changes to iOS and to new iPhones to close the hole it left in the iPhone 5C. It is right to take that step, and I only hope it takes equally strong measures to ensure the protection of our data from disclosure to foreign governments.