Under the regime of Apple’s iOS 7 operating system, a law enforcement agency seeking access to a passcode-protected iPhone would send the phone, along with a valid search warrant, to Apple, which would then be obligated to bypass the passcode using a built-in “backdoor,” thereby providing the law enforcement agency with the relevant information found on the phone. When iPhone users upgrade to the recently released iOS 8 (or purchase an iPhone 6 or 6 Plus, which come preloaded with the operating system), Apple will lose this ability. Per Apple’s own “Privacy” page:
On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.
This development has predictably inspired both positive and negative reactions, with the law enforcement community among its most vocal critics. Describing himself as “very concerned,” FBI Director James Comey said: “I am a huge believer in the rule of law, but I also believe that no one in this country is beyond the law. What concerns me about this is companies marketing something expressly to allow people to place themselves beyond the law.”
There are myriad policy considerations in this conversation. Just to name a few: one argument is that iOS 8 also stops oppressive governments from bullying their way into the privacy of citizens, particularly dissident activists; another is that, while technology increases law enforcement’s access to the details of our lives, we should welcome the occasional stone on the “privacy” side of the scale; and a third, that backdoor access to personal data inevitably makes a device more vulnerable to malevolent actors in addition to law enforcement. A fascinating issue from the legal perspective is how law enforcement will respond. In order to decrypt phones, agencies will have to compel individuals to divulge or enter their passcodes.
Apple’s innovation comes after the NSA, according to a leaked PowerPoint slide, found a backdoor into Apple’s servers in order to access private information. It also follows the 2014 Riley v. California U.S. Supreme Court decision, in which a unanimous Court held unconstitutional the warrantless search of a cellular phone found on an arrestee.
Encryption of personal data has a jurisprudential and public policy history very relevant to this discussion. The 1990s saw what was known as the “crypto wars.” With new developments in personal information encryption, the Federal government sought to combat cryptography and ensure access for law enforcement. In fact, until 1996, high-level encryption software was categorized as a munition by the Federal government, which thus limited (and punished) its export without State Department approval. Around this time, the government developed and attempted to legally mandate the Clipper chip, an encryption device with a built-in backdoor to allow government agencies access to personal computing devices. When it was revealed that the Clipper chip compromised the security of encryption schemes beyond government access, the idea was tossed, along with other mandatory encryption programs.
George Washington University Law School professor Orin Kerr wrote in the Washington Post that the Fifth Amendment right against self-incrimination does not reach the compelled disclosure of one’s cellphone passcode (though Professor Kerr noted in subsequent articles that his position on the iOS 8 issue has evolved, his legal analysis does not appear to have changed). Professor Kerr argues that, when a suspect’s cellphone is found on his or her person, it is essentially self-evident that the cellphone belongs to the suspect and that he or she knows the passcode to the phone. Thus, compelling the suspect to divulge or enter the passcode adds no relevant testimony (thereby avoiding Fifth Amendment concerns), but rather merely allows for the collection of evidence. For authority, Professor Kerr cites In re Boucher, 2009 WL 424718 (D.Vt. 2009), a case in which the defendant was compelled to decrypt files on his computer that the government already knew existed. In his post, Professor Kerr does not discuss what other legal thinkers consider to be a central element of Boucher—that being forced to access the files did not violate Boucher’s Fifth Amendment rights because the government already knew what files it was looking for and that they were contained in the encrypted hard drive. The decision cites both the government’s ability to link the computer to Boucher without his testimony and its awareness of the files within as relevant to precluding a Fifth Amendment claim:
Again providing access to the unencrypted Z drive ‘adds little or nothing to the sum total of the Government’s information about the existence and location of files that may contain incriminating information.’ Fisher, 425 U.S. at 411. Boucher’s act of producing an unencrypted version of the Z drive likewise is not necessary to authenticate it. He has already admitted to possession of the computer, and provided the Government with access to the Z drive. The Government has submitted that it can link Boucher with the files on his computer without making use of his production of an unencrypted version of the Z drive, and that it will not use his act of production as evidence of authentication.
Jody Goodman, an attorney at the law firm Crowell & Moring, authored an article discussing the Constitutional implications of mandatory decryption. In it, she presents an analogy of an indestructible safe, and offers the proposition that if the government knew what was inside, they could compel a suspect to open the safe or divulge the password, but if not, no such disclosure could be forced:
So in Doe, where government agents didn’t know what was on the computer, the court could not compel Doe to use “the contents of his own mind” to decrypt the data; in Fricosu, on the other hand, Fricosu’s act of decryption would provide the government with the data on the computer, but it would not provide the information that the files existed in the first place. The government already knew that.
Goodman further asserts that technology fits neatly into a line of “testimony” jurisprudence stretching back to 1911, which affirms the idea that Fifth Amendment protection stops when the government already knows what information it will find once the encryption is broken.
With Apple’s encryption by default, this question could become increasingly relevant. Julian Sanchez of the Cato Institute writes:
Unbreakable encryption is not novel, but the decision to make iOS and Android devices encrypted by default is. Previously, at least, criminals had to be savvy enough to make the choice to use encryption consistently—and many weren’t. Encryption by default, because it protects average crooks as well as sophisticated cybercriminals, is likely to be a practical impediment in many more investigations. Criminals can still be punished for refusing a court order to unlock their devices, but may escape more serious charges that would be provable only with that encrypted evidence.
If government and law enforcement choose to combat Apple’s security measures, we could see considerable movement in this area of jurisprudence. There is also a chance that the development is simply accepted, and we reach a new norm of personal information security. Either way, Apple continues to steer the way that mainstream society uses technology every day.