Tuesday, October 17, 2017

"Responsible Encryption" - what does that mean?

This weekend I read this excellent article by Alex Gaynor responding to Deputy Attorney General Rod Rosenstein's remarks on encryption to two different audiences last week. Please do go and read it when you get a chance, as it delves into the sadly common tactic of pointing to a bunch of scary criminal incidents, then saying "unbreakable encryption enables criminals and terrorists", without presenting any evidence that those crimes were enabled by encryption technology, or that law enforcement officers were actually hampered in their investigations by encryption.

In fact, in the case of the FBI, Apple, and the San Bernardino shooter, Deputy AG Rosenstein repeats the same false narrative that we've been presented with before - that the shooter's phone possibly contained vital information, that Apple "could" decrypt that information, and that Apple fought the FBI's legal attempts to force them to do so. Read my previous blog post (linked above) for background on that line of argument, and on how the FBI willfully twists the facts of the case to try to get something much more far-reaching than what they claim to want.

One thing not addressed directly in Alex's article is the frustration that the FBI and other law enforcement officials have expressed over their inability to execute a legal search warrant when they're faced with a locked phone, or with a communications system that provides end-to-end encryption.

From Rosenstein's remarks to the Global Security Conference:
We use the term “responsible encryption” to describe platforms that allow police to access data when a judge determines that compelling law enforcement concerns outweigh the privacy interests of a particular user.  In contrast, warrant-proof encryption places zero value on law enforcement.  Evidence remains unavailable to the police, no matter how great the harm to victims.
First, what a bunch of emotionally charged words. And again we see the disconnect between what the FBI and other agencies say they want (a way to unlock individual phones) and what they seem to keep asking for (a key to unlock any phone they can get their hands on).

But the man does have a point - there is some value to society in the FBI being able to execute a valid search warrant against someone's phone, or to "tap" the communications between known criminals. And I think he's also right that that sort of access is not going to be provided if the free market is allowed to set the rules. It'll never be in Apple's or any individual customer's interest to make it easier to access a locked phone. So, it'll come down to a matter of legislation, and I think it's worth the tech folks having this conversation before Congress sits down with a bill authored by the FBI and the NSA to try to force something on us.

The encryption-in-flight question is very complicated (and crypto protocols are hard to get right - see the recent KRACK security vulnerabilities), so I'll leave that for a future post. I do believe, though, that there are reasonable ways for tech companies to design data-at-rest encryption that is accessible via a court order, but that maintains reasonably good security for customers. Here's a sketch of how one such idea might be implemented:

On-device Key Escrow

Key escrow
The basic idea of key escrow is that there can be two keys for a particular piece of encrypted data - one key that the user keeps, and one that is kept "in escrow" so another authorized agent can access the data, if necessary. The ill-fated Clipper Chip was an example of such a system. The fatal flaw of Clipper (well, one of them) was that it envisioned that every single protected device would have its secondary key held securely by the government, to be used in case a search warrant was issued. If Clipper had ever seen broad adoption, the value of that centralized key store would have been enormous, both economically and militarily. We're talking a significant fraction of the US GDP, probably trillions of dollars. That would have made it the #1 target of thieves and spies across the world.
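
To make that concrete, here's a minimal sketch of two-key escrow in Python, using envelope encryption: the data is encrypted once under a random data-encryption key (DEK), and that DEK is then wrapped under both the user's key and the escrow key. (This illustrates the general idea only - it's not Clipper's actual design, and the function names are mine.)

```python
# Minimal two-key escrow sketch via envelope encryption (illustrative only).
# Requires the third-party "cryptography" package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_with_escrow(plaintext: bytes, user_key: bytes, escrow_key: bytes):
    # Encrypt the data once, under a fresh random data-encryption key (DEK).
    dek = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(dek).encrypt(nonce, plaintext, None)

    # Wrap the DEK twice: one copy only the user can open, one copy only
    # the escrow agent can open. Either copy recovers the same data.
    n_user, n_escrow = os.urandom(12), os.urandom(12)
    user_slot = (n_user, AESGCM(user_key).encrypt(n_user, dek, None))
    escrow_slot = (n_escrow, AESGCM(escrow_key).encrypt(n_escrow, dek, None))
    return nonce, ciphertext, user_slot, escrow_slot
```

Note that the cryptography here is the easy part. The hard question is who holds the escrow key, and under what conditions it can be used - which is exactly where Clipper went wrong.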

Eliminating central key storage
But the FBI really doesn't need the ability to decrypt every phone out there. They need the ability to decrypt specific phones, in response to a valid search warrant. So, how about storing the second key on the device itself? Every current on-device encryption solution that I know of provides for the option of multiple keys. And in fact, briefly getting back to the San Bernardino shooter's phone: if the owner of that phone (San Bernardino County) had had a competent IT department, they would have set up a second key that they could then have handed over to the FBI, neatly avoiding that whole legal mess with Apple.
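
As a sketch of what that could have looked like: granting a second party access is just wrapping the existing data key under one more key - the data itself never has to be re-encrypted. (Again, these function names are illustrative, not any real MDM or vendor API.)

```python
# Sketch of a second "key slot": wrap the existing DEK under another key.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def add_key_slot(dek: bytes, new_owner_key: bytes):
    """Grant another party access by wrapping the DEK under their key."""
    nonce = os.urandom(12)
    return nonce, AESGCM(new_owner_key).encrypt(nonce, dek, None)

def open_key_slot(slot, owner_key: bytes) -> bytes:
    nonce, wrapped = slot
    return AESGCM(owner_key).decrypt(nonce, wrapped, None)

# e.g. a county IT department provisioning its own slot on a managed phone:
dek = AESGCM.generate_key(bit_length=256)         # the phone's data key
county_key = AESGCM.generate_key(bit_length=256)  # held by the IT department
county_slot = add_key_slot(dek, county_key)
assert open_key_slot(county_slot, county_key) == dek  # recoverable later
```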

You could imagine Apple generating a separate "law enforcement" key for every phone, and storing all of those somewhere, but that has all the same problems as the Clipper central key repository, just on a slightly smaller scale. So those keys need to be stored separately - and the natural place is on the device itself.

Use secure storage
Not every phone has a "secure enclave" processor like the iPhone's, but it's a feature that you'll increasingly see on newer phones, as Apple and other manufacturers try to compete on the basis of providing better privacy protection to their customers. The important property of these processors is that they don't allow software running on the phone to extract the stored keys. This is what keeps the user's data secure from hackers. So, if the key is stored in there, but the phone software can't get it out, how will the FBI get the key?
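
To illustrate the property that matters here, consider a toy model (this is just the shape of the interface, not Apple's actual Secure Enclave API): the enclave will perform operations with the keys it holds, but there is deliberately no call that hands key material back to software.

```python
# Toy model of a secure key store: keys are generated inside, and software
# can only request operations on them, never the keys themselves.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

class SecureElementModel:
    def __init__(self):
        self._user_key = os.urandom(32)    # used for everyday unlocks
        self._escrow_key = os.urandom(32)  # the "second key" from above

    def unwrap_for_user(self, slot) -> bytes:
        """The only software-facing path: unwrap a data key using the user
        key (a real enclave would also demand the passcode or biometric)."""
        nonce, wrapped = slot
        return AESGCM(self._user_key).decrypt(nonce, wrapped, None)

    # Deliberately absent: get_user_key(), get_escrow_key(), or anything
    # else that exports key material to the phone's software.
```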

Require physical access
My preferred solution would be for the secure enclave to have a physically disconnected set of pins that can be used just for extracting the second key. In order to extract the key, you'd need to have physical access to the device, disassemble it, and solder some wires onto it. This is, I think, sufficiently annoying that nobody would try to do it without getting a warrant first.
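
Continuing the toy model from above, the pin interface might look like this: the escrow key leaves the enclave only through a handle that stands in for the soldered-on probe, something software alone can't fabricate. (The PhysicalProbe type is hypothetical; in real hardware this would be a readout on the dedicated pins, not a method call.)

```python
class PhysicalProbe:
    """Stand-in for physical possession: wires soldered onto the dedicated
    pins. Nothing running on the phone can construct one of these."""

def read_escrow_pins(enclave: "SecureElementModel", probe: PhysicalProbe) -> bytes:
    # The only route to the escrow key runs through the bench, not the OS.
    if not isinstance(probe, PhysicalProbe):
        raise PermissionError("escrow key requires physical access")
    return enclave._escrow_key  # in hardware: a pin readout, not software

# With the extracted key in hand, investigators could open the escrow slot
# from the earlier sketch and recover the phone's data key.
```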

It also means that nobody can search your phone without taking it out of your possession for a good long while. This seems like a reasonable trade-off to me. If someone executes a search warrant on your house, you'll certainly know about it. There is such a thing as a "sneak and peek" warrant (a delayed-notice warrant, where police sneak in and search your home while you're not there), but I'm not particularly interested in solving that problem for them.

Conclusion
Is this a perfect solution? Of course not. But I think something like this is a reasonable place to start when discussing law enforcement access to personal electronics. And I think we want to have this conversation sooner rather than later. What do you think?
