This morning the Senate Judiciary Committee held a hearing on encryption and “lawful access.” That’s the fanciful idea that encryption providers can somehow allow law enforcement access to users’ encrypted data while otherwise preventing the “bad guys” from accessing this very same data.
But the hearing was not inspired by some new engineering breakthrough that might make it possible for Apple or Facebook to build a secure law enforcement backdoor into their encrypted devices and messaging applications. Instead, it followed speeches, open letters, and other public pressure by law enforcement officials in the U.S. and elsewhere to prevent Facebook from encrypting its messaging applications, and more generally to portray encryption as a tool used in serious crimes, including child exploitation. Facebook has signaled it won't bow to that pressure. And more than 100 organizations, including EFF, have called on these law enforcement officials to reverse course and avoid gutting one of the most powerful privacy and security tools available to users in an increasingly insecure world.
Many of the committee members seemed to arrive at the hearing convinced that they could legislate secure backdoors. Among others, Senators Graham and Feinstein told representatives from Apple and Facebook that they had a responsibility to find a solution to enable government access to encrypted data. Senator Graham commented, “My advice to you is to get on with it, because this time next year, if we haven’t found a way that you can live with, we will impose our will on you.”
But when it came to questioning witnesses, the senators had trouble establishing the need for or the feasibility of blanket law enforcement access to encrypted data. As all of the witnesses pointed out, even a basic discussion of encryption requires differentiating between encrypting data on a smartphone, also called “encryption at rest,” and end-to-end encryption of private chats, for example.
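That distinction can be made concrete with a toy sketch. This is not real cryptography (a simple XOR stands in for a real cipher like AES, and every name here is invented for illustration), but it shows who holds the key in each model: with encryption at rest, a key on the device protects stored data; with end-to-end encryption, only the two chat endpoints have the key, and the provider's server relays ciphertext it cannot read.

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy XOR "cipher" standing in for a real one (e.g. AES) -- illustrative only.
    return bytes(k ^ b for k, b in zip(key, data))

KEY_LEN = 16
message = b"meet at noon :-)"  # exactly 16 bytes, to match the toy key length

# Encryption at rest: a key that never leaves the device protects stored data.
device_key = secrets.token_bytes(KEY_LEN)
data_on_disk = xor_cipher(device_key, message)
assert xor_cipher(device_key, data_on_disk) == message  # unlocking the phone decrypts

# End-to-end encryption: only the sender and recipient hold the key; the
# provider's server sees and stores ciphertext, and holds no key at all.
chat_key = secrets.token_bytes(KEY_LEN)  # shared by the two endpoints only
what_the_server_sees = xor_cipher(chat_key, message)
assert what_the_server_sees != message                          # unreadable in transit
assert xor_cipher(chat_key, what_the_server_sees) == message    # recipient decrypts
```

In both models the provider is structurally unable to hand over plaintext it never possessed, which is why "lawful access" proposals necessarily mean changing who holds keys.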
As a result, the committee’s questioning actually revealed several points that undercut the apocalyptic vision painted by law enforcement officials in recent months. Here are some of our takeaways:
There’s No Such Thing As an Unhackable Phone
The first witness was Manhattan District Attorney Cyrus Vance, Jr., who has called for Apple and Google to roll back encryption in their mobile operating systems. Yet by his own statistics, the DA’s office is able to access the contents of a majority of devices it encounters in its investigations each year. Even for those phones that are locked and encrypted, Vance reported that half could be accessed using in-house forensic tools or services from outside vendors. Although he stressed both the high cost and the uncertainty of these tools, the fact remains that device encryption is far from an insurmountable barrier to law enforcement.
As we saw when the FBI dramatically lowered its own estimate of “unhackable” phones in 2017, the level of security of these devices is not static. Even as Apple and Google patch vulnerabilities that might allow access, vendors like Cellebrite and Grayshift discover new means of bypassing security features in mobile operating systems. Of course, no investigative technique will be completely effective, which is why law enforcement has always worked every angle it can. The cost of forensic tools may be a concern, but they are clearly among the variety of tools law enforcement uses to pursue investigations successfully in a world with widespread encryption.
Lawful Access to Encrypted Phones Would Take Us Back to the Bad Old Days
Meanwhile, even as Vance focused on the cost of forensic tools to access encrypted phones, he repeatedly ignored why companies like Apple began fully encrypting their devices in the first place. In a colloquy with Senator Mike Lee, Apple’s manager of user privacy Erik Neuenschwander explained that the company’s introduction of full disk encryption in iOS in 2014 was a response to threats from hackers and criminals who could otherwise access a wealth of sensitive, unencrypted data on users’ phones. On this point, Neuenschwander explained that Vance was simply misinformed: Apple has never held a key capable of decrypting encrypted data on users’ phones.
Neuenschwander explained that he could think of only two approaches to accomplishing Vance’s call for lawful access, both of which would dramatically increase the risks to consumers. Either Apple could simply roll back encryption on its devices, leaving users exposed to increasingly sophisticated threats from bad actors, or it could attempt to engineer a system where it did hold a master key to every iPhone in the world. Regarding the second approach, Neuenschwander said “as a technologist, I am extremely fearful of the security properties of such a system.” His fear is well-founded; years of research by technologists and cryptographers confirm that key escrow and related systems are highly insecure at the scale and complexity of Apple’s mobile ecosystem.
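Why a vendor-held master key is so dangerous can be sketched in a few lines. Again, this is a toy model, not real cryptography (XOR stands in for real encryption, and the names are invented for illustration), but it captures the structural problem: under key escrow, every device key is wrapped under one master key, so a single leaked secret compromises every phone at once.

```python
import secrets

KEY_LEN = 32

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy XOR "cipher" standing in for real encryption -- illustrative only.
    return bytes(k ^ b for k, b in zip(key, data))

# Each phone encrypts its data under its own, unique device key.
device_keys = {f"phone-{i}": secrets.token_bytes(KEY_LEN) for i in range(1000)}
plaintext = b"sensitive user data".ljust(KEY_LEN)
ciphertexts = {name: xor_cipher(key, plaintext) for name, key in device_keys.items()}

# Key escrow: every device key is "wrapped" under ONE vendor-held master key.
master_key = secrets.token_bytes(KEY_LEN)
escrow_db = {name: xor_cipher(master_key, key) for name, key in device_keys.items()}

# An attacker who steals only the master key unwraps every device key...
recovered = {name: xor_cipher(master_key, wrapped) for name, wrapped in escrow_db.items()}
# ...and with them decrypts every phone: one leaked secret, total compromise.
assert all(xor_cipher(recovered[n], ciphertexts[n]) == plaintext for n in ciphertexts)
```

Without the escrow database, an attacker would have to compromise each of the thousand device keys separately; with it, the master key becomes a single point of failure that must be defended perfectly, forever, against every adversary.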
End-to-End Encryption Is Here to Stay
Finally, despite the heated rhetoric directed by Attorney General Barr and others at end-to-end encryption in messaging applications, the committee found little consensus. Both Vance and Professor Matt Tait suggested that they did not believe that Congress should mandate backdoors in end-to-end encrypted messaging platforms. Meanwhile, Senators Coons, Cornyn, and others expressed concerns that doing so would simply push bad actors to applications hosted outside of the United States, and also aid authoritarian states that want to spy on Facebook users within their own borders. Facebook’s director for messaging privacy Jay Sullivan discussed ways that the company will root out abuse on its platforms while removing its own ability to read users’ messages. As we’ve written before, an encrypted Facebook Messenger is a good thing, but the proof will be in the pudding.
Ultimately, while the Senate Judiciary Committee hearing offered worrying posturing on the necessity of backdoors, we’re hopeful that Congress will recognize what a dangerous idea legislation would be in this area.
Author: Andrew Crocker