responsible encryption – Techdirt

Encryption Working Group Releases Paper To 'Move The Conversation Forward'

from the what-conversation? dept

One of the frustrating aspects of the “debate” (if you can call it that) over encryption and whether or not law enforcement should be able to have any kind of “access” is that it’s been no debate at all. You have people who understand encryption who keep pointing out that what is being asked of them is impossible to do without jeopardizing some fairly fundamental security principles, and then a bunch of folks who respond with “well, just nerd harder.” There have been a few people who have suggested, at the very least, that “a conversation” was necessary between the different viewpoints, but mostly when that’s brought up it has meant non-technical law enforcement folks lecturing tech folks on why “lawful access” to encryption is necessary.

However, it appears that the folks at the Carnegie Endowment put together an actual working group of experts with widely varying viewpoints to see if there was any sort of consensus or any way to move an actual conversation forward. I know or have met nearly everyone on the working group, and it’s an impressive group of very smart and thoughtful people — even those I frequently disagree with. It’s a really good group and the paper they’ve now come out with is well worth reading. I don’t know that it actually moves the conversation “forward” because, again, I’m not sure there is any conversation to move forward. But I do appreciate that it got past the usual talking points. The paper kicks off by saying that it’s going to “reject two straw men,” which are basically the two positions frequently stated regarding law enforcement access to encrypted communication:

First of all, we reject two straw men — absolutist positions not actually held by serious participants, but sometimes used as caricatures of opponents — (1) that we should stop seeking approaches to enable access to encrypted information; or (2) that law enforcement will be unable to protect the public unless it can obtain access to all encrypted data through lawful process. We believe it is time to abandon these and other such straw men.

And… that’s fine, in that the first of those statements is not the position those who support strong encryption actually hold. I mean, there have been multiple reports detailing how we’re actually in the “golden age of surveillance”, and that law enforcement has so much greater access to basically every bit of communications possible, and that there are plenty of tools and ways to get information that is otherwise encrypted. Yes, it’s true that some information might remain encrypted, but no one has said that law enforcement shouldn’t do their basic detective work in trying to access information. The argument is just that they shouldn’t undermine the basic encryption that protects us all to do so.

Where the paper gets perhaps more interesting is that it suggests that any debate about access to encrypted data should focus on “data at rest” (i.e., data that is encrypted on a device) rather than “data in motion” which is the data that is being transferred across a network or between devices in some form. The paper does not say that we should poke holes in encryption that protects data at rest, and says, explicitly:

We have not concluded that any existing proposal in this area is viable, that any future such proposals will ultimately prove viable, or that policy changes are advisable at this time.

However, it does note that if there is a fruitful conversation on this topic, it’s likely to be around data at rest, rather than elsewhere. And, from there it notes that any discussion of proposals for accessing such data at rest must take into account both the costs and the benefits of such access to determine if it is viable. While some of us strongly believe that there is unlikely to ever be a proposal where the costs don’t massively outweigh the benefits, this is the correct framework for analyzing these things. And it should be noted that, too often, these debates involve one group only talking about the benefits and another only talking about the costs. Having a fruitful discussion requires being willing to measure both.

From there, the group sets up a framework for how to weigh those costs and benefits — including setting up a bunch of use cases against which any proposal should be tested. Again, this seems like the right approach to systematically exploring and stress testing any idea brought forth that claims it will “solve” the “problem” that some in law enforcement insist encryption has created for them. I am extremely skeptical that any such proposal can pass such a stress test in a manner that suggests that the benefits outweigh the costs — but if those pushing to undermine encryption require a “conversation” and want people to explore the few proposals that have been brought up, this is the proper, and rigorous, way to do so.

The question, though, remains as to whether or not this will actually “move the conversation forward.” I have my doubts on that, in part because those who keep pressing for undermining encryption have never appeared to have much interest in actually having this type of conversation. They have mostly only seemed interested in the “nerd harder, nerds” approach to this, that assumes smart techies will give them their magic key without undermining everything else that keeps us secure. I fully expect that it won’t be long before a William Barr or a Chris Wray or a Richard Burr or a Cy Vance starts talking nonsense again about “going dark” or “responsible encryption” and ignores the framework set out by this working group.

That’s not to say this wasn’t a useful exercise. It likely was, if only to be able to point to it the next time one of the folks listed above spouts off again as if there are no tradeoffs and as if it’s somehow easy to solve the “encryption problem” as they see it.

Filed Under: data at rest, encryption, going dark, law enforcement, nerd harder, responsible encryption

Deputy AG Claims There's No Market For Better Security While Complaining About Encryption At A Cybercrime Conference

from the an-actual-thing-that-happened dept

The FBI still hasn’t updated its bogus “uncrackable phones” total yet, but that isn’t stopping the DOJ from continuing its push for holes in encryption. Deputy AG Rod Rosenstein visited Georgetown University to give a keynote speech at its Cybercrime 2020 Conference. In it, Rosenstein again expressed his belief that tech companies are to blame for the exaggerated woes of law enforcement.

Pedophiles teach each other how to evade detection on darknet message boards. Gangs plan murders using social media apps. And extortionists deliver their demands via email. So, it is important for those of us in law enforcement to raise the alarm and put the public on notice about technological barriers to obtaining electronic evidence.

One example of such a barrier is “warrant-proof” encryption, where tech companies design their products or services in such a way that they claim it is impossible for them to assist in the execution of a court-authorized warrant. These barriers are having a dramatic impact on our cases, to the significant detriment of public safety. Technology makers share a duty to comply with the law and to support public safety, not just user privacy.

Rosenstein says this has resulted in a “significant detriment [to] public safety,” but can’t point to any data or evidence to back that claim up. The FBI’s count of devices it can’t access is off by at least a few thousand devices, by most people’s estimates. In terms of this number alone, the “public safety” problem is, at best, only half as bad as the DOJ has led us to believe.

Going beyond that, crime rates remain at historic lows in most places in the country, strongly suggesting no crime wave has been touched off by the advent of default encryption. Law enforcement agencies aren’t complaining about cases they haven’t cleared — if you exclude encryption alarmist/Manhattan DA Cyrus Vance. (Anyone hoping to have an honest conversation about encryption certainly should.)

Somehow, Rosenstein believes the public would experience a net safety gain by making their devices and personal info more easily accessed by criminals. Holes in encryption can be marked “law enforcement only,” much like private property owners can hang “no trespassing” signs. But neither is actually a deterrent to determined criminals.

Rosenstein goes on to tout “responsible encryption” — a fairy tale he created that revolves around the premise tech companies can break/unbreak encryption at the drop of a warrant. But broken encryption can’t be unbroken, not even with some form of key escrow. The attack vector may change, but it still exists.
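To see the structural problem in the smallest possible terms, here’s a toy Python sketch (stdlib only; the XOR “cipher” is a stand-in for real crypto, and the point is solely where the keys live):

```python
# Toy sketch: the same message, with and without key escrow. The XOR
# "cipher" below is an illustration only, not a real cipher.
import hashlib
import secrets

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # XOR against a hash-derived keystream; symmetric, so it also decrypts.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Without escrow: one secret, held only by the user.
user_key = secrets.token_bytes(32)
ciphertext = toy_encrypt(user_key, b"attorney-client privileged notes")

# "Responsible encryption": the user's key is ALSO wrapped under an
# escrow key held by the vendor or the government.
escrow_key = secrets.token_bytes(32)
wrapped_for_escrow = toy_encrypt(escrow_key, user_key)

# Anyone who obtains escrow_key (a breach, an insider, a bribed employee)
# now has a second path to the plaintext that did not exist before:
recovered_user_key = toy_encrypt(escrow_key, wrapped_for_escrow)
assert toy_encrypt(recovered_user_key, ciphertext) == b"attorney-client privileged notes"
```

The escrow step doesn’t touch the math of the cipher at all. It just creates a second secret, held by someone else, whose theft defeats the cipher anyway.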

That Rosenstein is advocating inferior encryption during a cybercrime conference speaks volumes about what the DOJ actually considers to be worth protecting. It’s not businesses and their customers. It’s law enforcement’s access. He spends half the run time talking about security breaches involving tech companies and follows it up by suggesting they take less care securing all this info they collect.

He even goes so far as to claim better security is something customers don’t want and is bad for tech companies’ bottom lines.

Building secure devices requires additional testing and validation—which slows production times — and costs more money. Moreover, enhanced security can sometimes result in less user-friendly products. It is inconvenient to type your zip code when you use a credit card at the gas station, or type a password into your smartphone.

Creating more secure devices risks building a product that will be later to market, costlier, and harder to use. That is a fundamental misalignment of economic incentives and security.

The implicit statement Rosenstein’s making is that ramped-up security — including default encryption — is nothing more than companies screwing shareholders just so they can stick it to The Man. Following this bizarre line of thought is to buy into Rosenstein’s conspiracy theory: one that views tech companies as a powerful cabal capable of rendering US law enforcement impotent.

And as much as Rosenstein hammers tech companies for security breaches that have exposed the wealth of personal data they collect, he ignores the question his encryption backdoor/side door advocacy raises. This question was posed in an excellent post by Cathy Gellis at the beginning of this year:

“What is a company to do if it suffers a data breach and the only thing compromised is the encryption key it was holding onto?”

We’re headed into 2019 and no one in the DOJ or FBI is willing to honestly discuss the side effects of their proposals. Rosenstein clings to his “responsible encryption” myth and the director of the FBI wants to do nothing more than make it the problem of “smart people” at tech companies he’s seeking to bend to his will. No one in the government wants to take responsibility for the adverse outcomes of weakened encryption, but they’re more than willing to blame everyone else any time their access to evidence seems threatened.

Rosenstein’s unwavering stance on the issue makes this statement, made at the closing of his remarks, ring super-hollow.

We should not let ideology or dogma stand in the way of constructive academic engagement.

Fair enough, Rod. You go first.

Filed Under: backdoors, blame, doj, encryption, fbi, going dark, responsible encryption, rod rosenstein

Bill Introduced To Prevent Government Agencies From Demanding Encryption Backdoors

from the pushing-back-from-the-top-down dept

The FBI continues its push for a solution to its “going dark” problem. Joined by the DOJ, agency head Christopher Wray has suggested the only way forward is a legislative or judicial fix, gesturing vaguely to the thousands of locked phones the FBI has gathered. It’s a disingenuous push, considering the tools available to the agency to crack locked devices and obtain the apparently juicy evidence hidden inside.

The FBI hasn’t been honest in its efforts or its portrayal of the problem. Questions put to the FBI about its internal efforts to crack locked devices are still unanswered. The only “new” development isn’t all that new: Ray Ozzie’s “key escrow” proposal may tweak a few details but it’s not that far removed in intent from the Clipper Chip that kicked off the first Crypto War. It’s nothing more than another way to make device security worse, with the only beneficiary being the government.

The FBI’s disingenuousness has not gone unnoticed. Efforts have been made over the last half-decade to push legislators towards mandating government access, but no one has been willing to give the FBI what it wants if it means making encryption less useful. A new bill [PDF], introduced by Zoe Lofgren, Thomas Massie, Ted Poe, Jerry Nadler, Ted Lieu, and Matt Gaetz would codify this resistance to government-mandated backdoors.

The two-page bill has sweeping safeguards that uphold security both for developers and users. As the bill says, “no agency may mandate or request that a manufacturer, developer, or seller of covered products design or alter the security functions in its product or service to allow the surveillance of any user of such product or service, or to allow the physical search of such product, by any agency.”

This bill would protect companies that make encrypted mobile phones, tablets, desktop and laptop computers, as well as developers of popular software for sending end-to-end encrypted messages, including Signal and WhatsApp, from being forced to alter their products in a way that would weaken the encryption. The bill also forbids the government from seeking a court order that would mandate such alterations. The lone exception is for wiretapping standards required under the 1994 Communications Assistance for Law Enforcement Act (CALEA), which itself specifically permits providers to offer end-to-end encryption of their services.

The Secure Data Act shouldn’t be needed but the FBI and DOJ have forced the hand of legislators. Rather than take multiple hints dropped by the previous administration, the agencies have only increased the volume of their anti-encryption rhetoric in recent months. Maybe the agencies felt they’d have the ear of the current administration and Congressional majority, but investigations involving the president and his staff have pretty much killed any “law and order” leanings the party normally retains. This bill may see widespread bipartisan support simply because it appears to be sticking it to the Deep State. Whatever. We’ll take it. Hopefully, this makes a short and direct trip to the Oval Office for a signature.

Filed Under: backdoors, congress, doj, encryption, fbi, going dark, jerry nadler, matt gaetz, responsible encryption, secure data act, security, ted lieu, ted poe, thomas massie, zoe lofgren

Software Legend Ray Ozzie Thinks He Can Safely Backdoor Encryption; He's Very Wrong

from the and-dangerous dept

There have been ongoing debates for a while now about the stupidity of backdooring encryption, with plenty of experts explaining why there’s no feasible way to do it without causing all sorts of serious consequences (some more unintended than others). Without getting too deep into the weeds, the basic issue is that cryptography is freaking difficult and if something goes wrong, you’re in a lot of trouble very fast. And it’s very, very easy for something to go wrong. Adding in a backdoor to encryption is, effectively, making something go wrong… on purpose. In doing so, however, you’re introducing a whole host of other opportunities for many, many things to go wrong, blowing up the whole scheme and putting everyone’s information at risk. So, if you’re going to show up with a “plan” to backdoor encryption, you better have a pretty convincing argument for how you avoid that issue (because the reality is you can’t).

For at least a year (probably more) the one name that has kept coming up over and over as one of the few techies who insists that the common wisdom on backdooring encryption is wrong… is Ray Ozzie. Everyone notes that he’s Microsoft’s former Chief Software Architect and CTO, but some of us remember him from way before that when he created Lotus Notes and Groove Networks (which was supposed to be the nirvana of collaboration software). In recent months his name has popped up here and there, often by FBI/DOJ folks seeking to backdoor encryption, as having some possible ways forward.

And, recently, Wired did a big story on his backdoor idea, where he plays right into the FBI’s “nerd harder” trope, by saying exactly what the FBI wants to hear, and which nearly every actual security expert says is wrong:

Ozzie, trim and vigorous at 62, acknowledged off the bat that he was dealing with a polarizing issue. The cryptographic and civil liberties community argued that solving the problem was virtually impossible, which “kind of bothers me,” he said. “In engineering if you think hard enough, you can come up with a solution.” He believed he had one.

This, of course, is the same sort of thing that James Comey, Christopher Wray and Rod Rosenstein have all suggested in the past few years: “you techies are smart, if you just nerd harder, you’ll solve the problem.” Ozzie, tragically, is giving them ammo. But he’s not delivering the actual goods.

The Wired story details his plan which is not particularly unique. It takes concepts that others have proposed (and which have been shown to not be particularly secure) and puts a fresh coat of paint on them. Basically, the vendor of a device has a private key that it needs to keep secret, and under some “very special circumstances” it can send an employee into the dark chamber to do the requisite dance, retrieve the code, and give it to law enforcement. That’s been suggested many times, and it’s been explained many times why that opens up all sorts of dangerous scenarios that could put everyone at risk. The one piece that does seem different is that Ozzie wants a sort of limitation on the possible damage his system does if it goes wrong (in one particular way), which is that under his system if the backdoor is used, it can only be used on one phone and then it disables that phone forever:

Ozzie designed other features meant to reassure skeptics. Clear works on only one device at a time: Obtaining one phone’s PIN would not give the authorities the means to crack anyone else’s phone. Also, when a phone is unlocked with Clear, a special chip inside the phone blows itself up, freezing the contents of the phone thereafter. This prevents any tampering with the contents of the phone. Clear can’t be used for ongoing surveillance, Ozzie told the Columbia group, because once it is employed, the phone would no longer be able to be used.
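For those who want the moving parts spelled out, here’s a rough sketch of that workflow as the excerpt describes it. It assumes the pyca/cryptography package, and the names (Device, lawful_unlock) are illustrative, not Ozzie’s:

```python
# Sketch of the Clear-style flow described above, using the pyca/cryptography
# package. Device and lawful_unlock are illustrative names, not Ozzie's code.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

OAEP = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The vendor's master key pair; the private half must stay secret forever.
vendor_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
vendor_public = vendor_private.public_key()

class Device:
    def __init__(self, pin: bytes):
        # At setup the PIN is wrapped under the vendor's public key and
        # stored on the device; only the vendor can unwrap it.
        self.wrapped_pin = vendor_public.encrypt(pin, OAEP)
        self.bricked = False

    def lawful_unlock(self) -> bytes:
        # Handing over the wrapped PIN permanently disables the device
        # (the "special chip blows itself up" step in the excerpt).
        self.bricked = True
        return self.wrapped_pin

phone = Device(pin=b"482910")
recovered_pin = vendor_private.decrypt(phone.lawful_unlock(), OAEP)
assert recovered_pin == b"482910" and phone.bricked
# The critics' objection lives in one variable above: vendor_private now
# guards every device ever sold, and must be used routinely, not almost never.
```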

So, let’s be clear. That self-destruct piece isn’t what’s useful in “reassuring skeptics.” It’s simply the only thing that appears genuinely unique about Ozzie’s plan. And it hasn’t done much to reassure skeptics. As the report notes, when Ozzie laid this out at a special meeting of super smart folks in the field, it didn’t take long for one to spot a hole:

The most dramatic comment came from computer science professor and cryptographer Eran Tromer. With the flair of Hercule Poirot revealing the murderer, he announced that he’d discovered a weakness. He spun a wild scenario involving a stolen phone, a second hacked phone, and a bank robbery. Ozzie conceded that Tromer found a flaw, but not one that couldn’t be fixed.

“Not one that couldn’t be fixed.” But it took this guy just hearing about the system to find the flaw. There are more flaws. And they’re going to be catastrophic. Because that’s how cryptography works. Columbia computer science professor and all-around computer security genius Steve Bellovin (who was also at that meeting) highlights how Tromer’s flaw-spotting shows why Ozzie’s plan is a fantasy with dangerous consequences:

Ozzie presented his proposal at a meeting at Columbia — I was there — to a diverse group. Levy wrote that Ozzie felt that he had “taken another baby step in what is now a two-years-and-counting quest” and that “he’d started to change the debate about how best to balance privacy and law enforcement access”. I don’t agree. In fact, I think that one can draw the opposite conclusion.

At the meeting, Eran Tromer found a flaw in Ozzie’s scheme: under certain circumstances, an attacker can get an arbitrary phone unlocked. That in itself is interesting, but to me the important thing is that a flaw was found. Ozzie has been presenting his scheme for quite some time. I first heard it last May, at a meeting with several brand-name cryptographers in the audience. No one spotted the flaw. At the January meeting, though, Eran squinted at it and looked at it sideways — and in real-time he found a problem that everyone else had missed. Are there other problems lurking? I wouldn’t be even slightly surprised. As I keep saying, cryptographic protocols are hard.

Bellovin also points out — as others have before — that there’s a wider problem here: how other countries will use whatever stupid example the US sets for much more nefarious purposes:

If the United States adopts this scheme, other countries, including specifically Russia and China, are sure to follow. Would they consent to a scheme that relied on the cooperation of an American company, and with keys stored in the U.S.? Almost certainly not. Now: would the U.S. be content with phones unlockable only with the consent and cooperation of Russian or Chinese companies? I can’t see that, either. Maybe there’s a solution, maybe not — but the proposal is silent on the issue.

And we’re just getting started on how many experts are weighing in on just how wrong Ozzie is. Errata Security’s Rob Graham pulls no punches pointing out that:

He’s only solving the part we already know how to solve. He’s deliberately ignoring the stuff we don’t know how to solve. We know how to make backdoors, we just don’t know how to secure them.

Specifically, Ozzie’s plan relies on the idea that companies can keep their master private key safe. To support the claim that this is possible, Ozzie (as the FBI has in the past) points to the fact that companies like Apple already keep their signing keys secret. And that’s true. But that assumes, incorrectly, that signing keys and decryption keys are the same thing and can be treated similarly. They’re not and they cannot be. The security protocols around signing keys are intense, but part of that intensity is built around the idea that you almost never have to use a signing key.

A decryption key is a different story altogether, especially with the FBI blathering on about thousands of phones it wants to dig its digital hands into. And, as Graham notes, you quickly run into a scaling issue, and with that scale, you ruin any chance of keeping that key secure.

Yes, Apple has a vault where they’ve successfully protected important keys. No, it doesn’t mean this vault scales. The more people and the more often you have to touch the vault, the less secure it becomes. We are talking thousands of requests per day from 100,000 different law enforcement agencies around the world. We are unlikely to protect this against incompetence and mistakes. We are definitely unable to secure this against deliberate attack.

And, even worse, when that happened, we wouldn’t even know.

If Ozzie’s master key were stolen, nothing would happen. Nobody would know, and evildoers would be able to freely decrypt phones. Ozzie claims his scheme can work because SSL works — but then his scheme includes none of the many protections necessary to make SSL work.

What I’m trying to show here is that in a lab, it all looks nice and pretty, but when attacked at scale, things break down — quickly. We have so much experience with failure at scale that we can judge Ozzie’s scheme as woefully incomplete. It’s not even up to the standard of SSL, and we have a long list of SSL problems.

And so Ozzie’s scheme relies on an impossibility. That you could protect a decryption key that has to be used frequently, the same way that a signing key is currently protected. And that doesn’t work. And when it fails, everyone is seriously fucked.
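Graham’s scaling point can be put in rough numbers with a toy model. The per-use leak probability here is a purely illustrative assumption; only the ratio between the two results matters:

```python
# Toy exposure model: assume each handling of a key carries some small,
# independent chance p of leak or misuse. All numbers are illustrative.
def compromise_probability(p_per_use: float, uses: int) -> float:
    return 1 - (1 - p_per_use) ** uses

signing_key_uses = 50            # rare, ceremony-protected events per year
escrow_key_uses = 1000 * 365     # "thousands of requests per day"

for label, uses in [("signing key", signing_key_uses),
                    ("escrow/decryption key", escrow_key_uses)]:
    print(f"{label}: {compromise_probability(1e-6, uses):.4%} risk over a year")
# signing key: ~0.0050%; escrow/decryption key: ~30.6%. Same per-use risk,
# wildly different outcome, purely because of how often the key is touched.
```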

Graham’s article also notes that Ozzie is — in true nerd harder fashion — focusing on this as a technological problem, ignoring all the human reasons why such a system will fail and such a key won’t be protected.

It focuses on the mathematical model but ignores the human element. We already know how to solve the mathematical problem in a hundred different ways. The part we don’t know how to secure is the human element.

How do we know the law enforcement person is who they say they are? How do we know the “trusted Apple employee” can’t be bribed? How can the law enforcement agent communicate securely with the Apple employee?

You think these things are theoretical, but they aren’t.

Cryptography expert (and professor at Johns Hopkins), Matt Green did a fairly thorough tweetstorm debunking of Ozzie’s plan as well. He also points out, as Graham does, the disaster scenario of what happens when (not if) the key gets out. But, an even bigger point that Green makes is that Ozzie’s plan relies on a special chip in every device… and assumes that we’ll design that chip to work perfectly and never get broken. And that’s ridiculous:

3. Let’s be more clear about this. All Apple phones have a similar chip inside of them. This chip is designed to prevent people from brute-forcing the passcode by limiting the number of attempts you can make.

At present, every one of these chips appears to be completely broken.

— Matthew Green (@matthew_d_green) April 25, 2018

4. Specifically, there is some (as yet unknown) exploit that can completely bypass the internal protections provided by Apple’s Secure Enclave Processor. So effectively “the chip” Ozzie relies on is now broken. https://t.co/wqoyzfaC2G

— Matthew Green (@matthew_d_green) April 25, 2018

5. When you’re proposing a system that will affect the security of a billion Apple devices, and your proposal says “assume a lock nobody can break”, you’d better have some plan for building such a lock.

— Matthew Green (@matthew_d_green) April 25, 2018

Green and Graham also both point to the example of GrayKey, the recently reported-on tool that law enforcement has been using to crack into all supposedly encrypted iPhones. Already, someone has hacked into the company behind GrayKey and leaked some of the code.

Put it all together and:

8. So let’s recap. We are going to insert a backdoor into billions of devices. Its security relies on a chip that is now broken. AND the people who broke that chip MAY HAVE LEAKED THEIR CODE TO EXTORTIONISTS ON THE INTERNET.

— Matthew Green (@matthew_d_green) April 25, 2018

Suddenly the fawning over Ozzie’s plan doesn’t look so good any more, does it? And, again, these are the problems that everyone who has dug into why backdoors are a bad idea have pointed out before:

11. Assumes a security technology with yet-to-be-achieved resilience to attacks (insider and outsider)

This technology is broken

The break is comically accessible even by random criminals, not sophisticated nation state attackers

— Matthew Green (@matthew_d_green) April 25, 2018

Green expanded some of his tweets into a blog post as well, which is also worth reading. In it, he also points out that even if we acknowledge the difference between signing keys and decryption keys, companies aren’t even that good at keeping signing keys safe (and those are almost certainly going to be more protected than decryption keys, since they need to be accessed much less frequently):

Moreover, signing keys leak all the time. The phenomenon is so common that journalists have given it a name: it’s called “Stuxnet-style code signing”. The name derives from the fact that the Stuxnet malware — the nation-state malware used to sabotage Iran’s nuclear program — was authenticated with valid code signing keys, many of which were (presumably) stolen from various software vendors. This practice hasn’t remained with nation states, unfortunately, and has now become common in retail malware.

And he also digs deeper into the point he made in his tweetstorm about how on the processor side, not even Apple has been able to keep its secure chip from being broken — yet Ozzie’s plan is based almost entirely on the idea that such an unbreakable chip would be available:

The richest and most sophisticated phone manufacturer in the entire world tried to build a processor that achieved goals similar to those Ozzie requires. And as of April 2018, after five years of trying, they have been unable to achieve this goal — a goal that is critical to the security of the Ozzie proposal as I understand it.

Now obviously the lack of a secure processor today doesn’t mean such a processor will never exist. However, let me propose a general rule: if your proposal fundamentally relies on a secure lock that nobody can ever break, then it’s on you to show me how to build that lock.

Update: We should add that the criticisms raised here are not new either. Back in February we wrote about a whitepaper by Riana Pfefferkorn making basically all of these same points that the folks quoted above are making. In other words, it’s a bit bizarre that Wired wrote this article as if Ozzie is doing something new and noteworthy.

So that’s a bunch of experts highlighting why Ozzie’s plan is silly. But, from the policy side it’s awful too. Because having Ozzie going around and spouting this debunked nonsense, but with his pedigree, simply gives the “going dark” and “responsible encryption” pundits something to grasp onto to claim they were right all along, even though they weren’t. They’ve said for years that the techies just need to nerd harder, and they will canonize Ray Ozzie as the proof that they were right… even though they’re not and his plan doesn’t solve any of the really hard problems.

And, as we noted much earlier in this post, cryptography is one of those areas where the hard problems really fucking matter. And if Ozzie’s plan doesn’t even touch on most of the big ones, it’s no plan at all. It’s a Potemkin Village that law enforcement types will parade around for the next couple of years insisting that backdoors can be made safely, even though Ozzie’s plan is not safe at all. I am sure that Ray Ozzie means well — and I’ve got tremendous respect for him and have for years. But what he’s doing here is actively harmful — even if his plan is never implemented. Giving the James Comeys and Chris Wrays of the world some facade they can cling to when claiming this can be done is only going to create many more problems.

Filed Under: encryption, encryption is hard, going dark, key escrow, matthew green, nerd harder, ray ozzie, responsible encryption, rob graham, security, steven bellovin

DOJ Back To Pushing For Legislation Targeting Encryption

from the CLIPPER-CHIP-2K18 dept

The New York Times is reporting that the War on Encryption continues, with a renewed push for legislation the Justice Department couldn’t talk Obama into.

Federal law enforcement officials are renewing a push for a legal mandate that tech companies build tools into smartphones and other devices that would allow access to encrypted data in criminal investigations.

F.B.I. and Justice Department officials have been quietly meeting with security researchers who have been working on approaches to provide such “extraordinary access” to encrypted devices, according to people familiar with the talks.

[…]

Against that backdrop, law enforcement officials have revived talks inside the executive branch over whether to ask Congress to enact legislation mandating the access mechanisms.

FBI Director Chris Wray still has yet to hand over his list of agreeable security experts to Sen. Ron Wyden. Wray continues to assert there’s a way to solve the “going dark” problem that won’t involve making device encryption less secure, but every suggestion he offers involves making device encryption less secure. There are a few techies looking for solutions, and that small group may be who Wray believes can talk legislators into prepping a mandated access bill.

A National Academy of Sciences committee completed an 18-month study of the encryption debate, publishing a report last month. While it largely described challenges to solving the problem, one section cited presentations by several technologists who are developing potential approaches.

They included Ray Ozzie, a former chief software architect at Microsoft; Stefan Savage, a computer science professor at the University of California, San Diego; and Ernie Brickell, a former chief security officer at Intel.

The solutions presented by this group are more of the same: key escrow, weakened encryption, or technological assistance mandates. None of these work out particularly well for customers, as each option provides additional attack vectors for criminals, not just law enforcement. So, even if Wray hopes to rely on more sympathetic tech experts, he’s still going to run into the same facts: you cannot provide access to law enforcement without increasing the chance of access by criminals and state-sponsored hackers.

It appears the DOJ isn’t interested in letting the perfect be the enemy of the good. And why should it? It won’t be affected by mandated access and/or weakened encryption. Those affected most will be members of the general public, and they simply don’t matter when the FBI’s agitating for destroying the encryption the public relies on to keep their devices and communications secure.

[O]ne Justice Department official familiar with the deliberations contended that it might not be necessary to come up with a foolproof system, arguing that a solution that would work for ordinary, less-savvy criminals was still worth pursuing.

Take a long look at that statement. This is the DOJ saying it’s willing to sacrifice the security of millions of Americans to make sure it can round up the nation’s least intelligent criminals. This isn’t a balance anyone outside of the FBI’s inner circle will be happy with. Wray and others routinely claim encryption is preventing them from solving serious crimes and hunting down dangerous criminals, but when all is said and done, the DOJ will apparently be satisfied locking up the most inept suspects.

Filed Under: backdoors, doj, encryption, fbi, going dark, legislation, responsible encryption

FBI Director Says It's 'Not Impossible' To Create Compromised Encryption That's Still Secure

from the saying-the-same-thing-over-and-over-doesn't-make-it-true dept

FBI Director Chris Wray was back on the “going dark” stump this week. In a speech [PDF] at Boston College, Wray again stated, without evidence, that it wasn’t impossible to create weakened encryption that isn’t weakened. (via Cyrus Farivar at Ars Technica)

We have a whole bunch of folks at FBI Headquarters devoted to explaining this challenge and working with stakeholders to find a way forward. But we need and want the private sector’s help. We need them to respond to lawfully issued court orders, in a way that is consistent with both the rule of law and strong cybersecurity. We need to have both, and can have both. I recognize this entails varying degrees of innovation by the industry to ensure lawful access is available. But I just don’t buy the claim that it’s impossible.

It really doesn’t matter whether or not Wray “buys” this claim. If you deliberately weaken encryption — either through key escrow or by making it easier to bypass — the encryption no longer offers the protection it did before it was compromised. That’s the thing about facts. They’re not like cult leaders. They don’t need a bunch of true believers hanging around to retain their strength.
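If you want the arithmetic behind that: any shortcut carved out for “lawful access” shrinks the work an attacker has to do, full stop. A toy brute-force demo, with deliberately tiny key sizes so it runs in seconds:

```python
# Toy brute-force demo: key sizes are tiny on purpose so it finishes quickly.
# The point is only the ratio between the two runtimes.
import hashlib
import secrets
import time

def crack(digest: bytes, key_bits: int) -> bytes:
    # Exhaustively try every possible key of the given size.
    n_bytes = key_bits // 8
    for n in range(2 ** key_bits):
        guess = n.to_bytes(n_bytes, "big")
        if hashlib.sha256(guess).digest() == digest:
            return guess

for key_bits in (16, 24):        # "weakened" vs. "original" toy sizes
    key = secrets.token_bytes(key_bits // 8)
    digest = hashlib.sha256(key).digest()
    start = time.perf_counter()
    assert crack(digest, key_bits) == key
    print(f"{key_bits}-bit key: cracked in {time.perf_counter() - start:.2f}s")
# Every bit of strength traded away for "lawful access" halves the work for
# every attacker, lawful or otherwise. There is no law-enforcement-only math.
```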

Yet Wray continues to believe this can be done. He has yet to provide Senator Ron Wyden with a list of tech experts who feel the same way. The “going dark” part of his remarks is filled with incongruity and non sequiturs. Like this, in which Wray says he doesn’t want backdoors, but rather instant access to encrypted data and communications… almost like a backdoor of some sort.

We’re not looking for a “back door” – which I understand to mean some type of secret, insecure means of access. What we’re asking for is the ability to access the device once we’ve obtained a warrant from an independent judge, who has said we have probable cause.

If by “backdoor,” he means insecure exploit, then he’s technically correct. If by “not a backdoor,” he means another door located on the front or side or connected to the basement or whatever, then what difference does the door’s location really make? A door is a door, and it provides an opening where there wasn’t one previously.

Solutions have been provided. There’s no shortage of people suggesting workarounds. Metadata is valuable even if Wray continues to downplay it. It’s a weird position for him to take considering the agency’s long reliance on metadata swept up by the NSA. Devices can be hacked, but Wray continues to assert this isn’t a solution either, even after Cellebrite made the stunning announcement it could crack any iPhone, including the latest models. There are a variety of third parties hosting communications in cloud services, all of which could be approached to gain access to at least some evidence. Even public enemy #1, Apple, stores encryption keys for its iCloud services, which would give law enforcement much of what can’t be obtained from a locked device.

Wray doesn’t want a solution that isn’t forced subservience of tech companies. That’s become plainly apparent as he continues his anti-encryption crusade. Tech experts are ignored. Hacking breakthroughs like Cellebrite’s aren’t even cited. Legislators, for the most part, have offered no support for anti-encryption legislation, and yet Wray continues to push for technical access he can’t define and proclaim his rightness despite having no expertise in the subject matter.

He also mentioned the stack of cellphones the agency claims it can’t access — 7,800 devices, or more than half of those the FBI tried to access last year. But the number is meaningless. Wray claims they’re all tied to investigations in one way or another, but does not describe what efforts were made to access their contents. Were the phones’ owners approached and asked for passcodes? Were they presented with the option of unlocking the devices or facing contempt charges? Were phones sent to Cellebrite or its competitors? Or has the FBI simply shrugged its shoulders, thrown them in a big pile, and decided to let the problem go unaddressed until it has enough legislators on its side?

In this discussion of The 7,800 Phones That Couldn’t Be Broken, Wray mentioned something that shows the FBI won’t be happy until it has mandated access to all encrypted data — not just data at rest on locked devices.

Being unable to access nearly 78-hundred devices is a major public safety issue. That’s more than half of all the devices we attempted to access in that timeframe. And that’s just at the FBI. That’s not even counting devices sought by other law enforcement agencies – our state, local, and foreign counterparts. It also doesn’t count important situations outside of accessing a specific device, like when terrorists, spies, and criminals use encrypted messaging apps to communicate, which is an increasingly widespread problem.

Wray ended his speech as he always does — with emotional appeals meant to throw shade on the tech experts who’ve told him his safely-broken encryption dreams are impossible.

After all, America leads the world in innovation. We have the brightest minds doing and creating fantastic things. A responsible solution will incorporate the best of two great American traditions – the rule of law and innovation. But for this to work, the private sector needs to recognize that it’s part of the solution. Again, I’m open to all kinds of ideas. But I reject this notion that there could be such a place that no matter what kind of lawful authority you have, it’s utterly beyond reach to protect innocent citizens. I also can’t accept that anyone out there reasonably thinks the state of play as it exists now – much less the direction it’s going – is acceptable.

Broken down, his final thoughts on “going dark” run like this:

1. Smart people refuse to help us.

2. They are irresponsible.

3. They are part of the problem.

4. They are making America unsafe.

Christ, what an asshole. The private sector is doing far more to “protect innocent citizens” than the FBI is. Encryption makes communications and data transfer much, much safer. Wray wants this weakened for one reason: to give law enforcement immediate access. Will this make America safer? The answer is no. Default encryption has been available for years now and there’s been no corresponding spike in criminal activity and no loud chorus of united law enforcement officials lamenting their inability to close cases or prosecute people. America’s jails are as full as they’ve ever been and crime rates remain far lower than they were prior to the advent of smartphones and encryption-by-default. It’s only a very small number of law enforcement officials that seem to have a problem with this, but they’re by far the loudest and most visible.

Filed Under: backdoors, chris wray, encryption, fbi, going dark, responsible encryption

Australian Government Continues To Push Encryption Backdoors It Refuses To Call Encryption Backdoors

from the 'we-like-to-call-them-little-miracles' dept

The Australian government has decided it can beat math at its own game. The laws of math will be defeated by the laws of Australia, the government declared last year. In an effort to tackle something this article calls “terror encryption,” the Home Affairs department says laws punching holes in encryption for government access are just around the corner.

Prime Minister Malcolm Turnbull may not understand the laws of mathematics or how signing a bunch of words into law doesn’t actually suspend them, but he does know tech companies are going to figure it out for him. Home Affairs Minister Peter Dutton agrees: the government just needs to mandate broken encryption and the tech companies will handle the rest. It’s for the good of the country, if not the world.

Home Affairs Minister Peter Dutton says ubiquitous encryption – a tool used for secure personal banking platforms and some messaging services – has become a major obstacle to terror investigations.

“We know that more than 90 per cent of counter-terrorism targets are using it for communications, including for attack planning here,” Mr Dutton told the National Press Club in Canberra on Wednesday.

“More than 90 per cent.” That seems high! I’m sure it’s based on rigorous examination of facts and probably includes terrorists visiting bank websites or anything else with an HTTPS URL. Dutton wants platform providers and device makers to make it as easy as dropping a wiretap on a phone line, so it’s clear the government isn’t just seeking access to data at rest.

Whatever it is that the Australian government wants, it seems unable to articulate in words. The analogies used (phone wiretaps) suggest the stuff Dutton says he doesn’t want is exactly what he wants.

Mr Dutton said he didn’t want a “backdoor key” to encrypted devices or a licence to hack into services.

But he argued law enforcement access to encrypted communications should be on the same basis as telephone and other intercepts, in response to warrants issued by the court.

If Dutton wants access to ongoing communications on platforms secured with end-to-end encryption, then a backdoor or a golden key is really what he wants, even if he’s unwilling to say so in public. Dutton also suggests companies will be punished for “allowing” terrorists to communicate using their encrypted platforms.

“Companies ought to be concerned with the reputational harm that comes from terrorists and criminals using their encryption and social media platforms for illicit ends,” he said.

“As a society we should hold these companies responsible when their service is used to plan or facilitate unlawful activity.”

I’m sure Dutton has more in mind than officious bad-mouthing of uncooperative tech companies by government officials. If holes are mandated, companies will be facing more than “reputational damage.” But tut-tutting about scofflaw tech companies isn’t going to budge the public opinion needle. Many people trust their communication platforms far more than their governments. And they value personal security and privacy far more than they value government access to communications, no matter how often the word “terrorism” is deployed as a justification.

Filed Under: australia, backdoors, encryption, responsible encryption, surveillance

Israeli Tech Company Says It Can Crack Any Apple Smartphone

from the thus-endeth-the-going-dark-conversation dept

Could this be the answer to FBI Director Chris Wray’s call for broken device encryption?

In what appears to be a major breakthrough for law enforcement, and a possible privacy problem for Apple customers, a major U.S. government contractor claims to have found a way to unlock pretty much every iPhone on the market.

Cellebrite, a Petah Tikva, Israel-based vendor that’s become the U.S. government’s company of choice when it comes to unlocking mobile devices, is this month telling customers its engineers currently have the ability to get around the security of devices running iOS 11. That includes the iPhone X, a model that Forbes has learned was successfully raided for data by the Department for Homeland Security back in November 2017, most likely with Cellebrite technology.

Big, if true, but not exactly the answer Wray, and others like him, are seeking. Cellebrite claims it can crack any Apple device, including Apple’s latest iPhone. This is a boon for law enforcement, as long as they have the money to spend on it and the time to send the device to Cellebrite to crack it.

It won’t scale because it can’t. The FBI claims it has thousands of locked devices — not all of them Apple products — and no one from Cellebrite is promising fast turnaround times. Even if it was low-cost and relatively scalable, it’s unlikely to keep Wray from pushing for a government mandate. Whatever flaw in the architecture is being exploited by Cellebrite is likely to be patched up by Apple as soon as it can figure out the company’s attack vector. And, ultimately, the fact that it doesn’t scale isn’t something to worry about (though the FBI doubtless will). No one said investigating criminal activity was supposed to be easy and, in fact, a handful of Constitutional amendments are in place to slow law enforcement’s roll to prevent the steamrolling of US citizens.

Cellebrite’s service apparently disables lockscreen protection, allowing the company to root around in the phone’s innards to pull out whatever law enforcement is seeking. This also apparently works with Android devices, although that news is far less surprising than discovering Apple’s security measures have been defeated. Default encryption isn’t an option for all Android devices, and that operating system is generally considered to be a pile of vulnerabilities d/b/a consumer software.

While this won’t end calls for weakened encryption, it does at least give law enforcement agencies another option to deploy against locked devices. But I don’t expect it to change the rhetoric. Those calling for “responsible encryption” don’t really want private sector solutions, no matter how much they claim to want to hold a “conversation” about lawful access. They want tech company subservience. They want the government — via judicial, executive, or legislative branch — to put companies in their place. In their opinion, tech companies have been getting uppity and forgetting the private sector exists to serve the government. It’s not just a Chris Wray problem. Plenty of government officials feel the same way. But the complaints about “going dark” are going to ring that much hollower when solutions are being offered by private companies other than the ones the FBI is just dying to smack around.

Filed Under: cracking, encryption, going dark, iphone, privacy, responsible encryption
Companies: apple, cellebrite

Report On Device Encryption Suggests A Few Ways Forward For Law Enforcement

from the time-to-dial-back-the-apocalyptic-narrative dept

Another paper has been released, adding to the current encryption discussion. The FBI and DOJ want access to the contents of locked devices. They call encryption that can be bypassed by law enforcement “responsible encryption.” It isn’t. A recent paper by cryptography expert Riana Pfefferkorn explained in detail how irresponsible these suggestions for broken or weakened encryption are.

This new paper [PDF] was put together by the National Academies of Sciences, Engineering, and Medicine. (h/t Lawfare) It covers a lot of ground others have already covered and rehashes the history of encryption, along with many of the pro/con arguments. That said, it’s still worth reading. It raises some good questions and spends a great deal of time discussing the multitude of options law enforcement has available, but which are ignored by FBI officials when discussing the backdoors/key escrow/weakened encryption they’d rather have.

The paper points out law enforcement now has access to much more potential evidence than it’s ever had. But that might not always be a good thing.

The widespread use of cloud storage means that law enforcement has another potential source of evidence to turn to when they do not have access to the data on devices, either because the device is unavailable or the data on the device is encrypted. Not all of this digital information will be useful, however. Because storage is cheap or even free, people keep all sorts of non-noteworthy electronic documents forever.

What’s unsaid here is law enforcement should be careful what it wishes for. Encryption that allows government on-demand access may drown it in useless data and documents. If time is of the essence in cases where law enforcement is seeking to prevent further criminal activity, having a golden key may not move things along any faster. I’m sure the FBI and others would prefer access all the same, but this does point to a potential negative side effect of cheap storage and endless data generation.

And the more access law enforcement has, the more chances there are for something to go horribly wrong on the provider’s end.

How frequently might vendors be asked to unlock phones? It is difficult to predict the volume of requests to vendors, but a figure in the tens of thousands per year seems reasonable, given the number of criminal wiretaps per year in the United States and the number of inaccessible devices reported by just the FBI and Manhattan District Attorney’s Office. As a result, each vendor, depending on its market share, needs to be able to handle thousands to tens of thousands of domestic requests per year.

Such a change in scale, as compared to the software update process, would necessitate a change in process and may require a larger number of people authorized to release an unlock code than are authorized to release a software update, which would increase the insider risk.
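A quick back-of-the-envelope pass over the report’s own estimate shows what that change in scale means in practice (the request volume and market share figures below are illustrative assumptions):

```python
# Back-of-the-envelope on the report's estimate. Every input here is an
# assumption for illustration, not a measured figure.
domestic_requests_per_year = 50_000   # "tens of thousands per year"
vendor_market_share = 0.45            # hypothetical large vendor
business_days_per_year = 250

vendor_requests = domestic_requests_per_year * vendor_market_share
per_day = vendor_requests / business_days_per_year
print(f"~{vendor_requests:,.0f} unlock requests/year "
      f"-> ~{per_day:.0f} per business day for one vendor")
# ~22,500/year -> ~90 per business day: a routine, human-staffed pipeline
# touching the crown jewels, versus a software-update ceremony performed
# perhaps a few dozen times a year.
```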

The paper also runs down stats provided by the FBI and the Manhattan DA’s office. It notes the overall number of unlockable phones has continued to rise but points out these numbers aren’t all that meaningful without context.

In November 11, 2016, testimony to this committee, then-Federal Bureau of Investigation (FBI) General Counsel James Baker reported that for fiscal year 2016, the FBI had encountered passcodes on 2,095 of the 6,814 mobile devices examined by its forensic laboratories. They were able to break into 1,210 of the locked phones, leaving 885 that could not be accessed. The information Baker presented did not address the nature of the crimes involved nor whether the crimes were solved using other techniques.

[…]

Although existing data clearly show that encryption is being encountered with increasing frequency, the figures above do not give a clear picture of how frequently an inability to access information seriously hinders investigations and prosecutions.

It goes on to note that we may never see this contextual information. Any attempt to collect this data would be hindered by law enforcement’s reluctance to provide it, and there are currently no visible efforts being made by agencies to determine just how often encryption stymies investigations. Whatever would actually be reported would be tainted by subjective assessments of encryption’s role in the investigation. However, without more context, the endless parade of locked device figures is nothing more than showmanship in service to the greater goal of undermining encryption.

The paper helpfully lists several options law enforcement can pursue, including approaching cloud services for content stored outside of locked devices. It also points out the uncomfortable fact that law enforcement doesn’t appear to be making use of tools it’s always had available. One of these options is compelled production of passwords or biometric data to unlock phones. While the Fifth Amendment implications of compelled password production are still under debate, it’s pretty clear fingerprints or retinas aren’t going to receive as much Constitutional protection.

On top of that, there’s the fact that a number of device owners have already voluntarily provided copies of encryption keys, and these can likely be accessed by law enforcement using a standard warrant or an All Writs Act order.

[M]any storage encryption products today offer key escrow-like features to avoid data loss or support business record management requirements. For example, Apple’s full disk encryption for the Mac gives the user the option to, in effect, escrow the encryption key. Microsoft Windows’ BitLocker feature escrows the key by default but allows users to request that the escrowed key be deleted. Some point to the existence of such products as evidence that key recovery for stored data can be implemented in a way that sensibly balances risks and benefits at least in certain contexts and against certain threats. In any case, data that is recoverable by a vendor without the user’s passcode can be recovered by the vendor for law enforcement as well. Key escrow-type systems are especially prevalent and useful where the user, or some other authorized person such as the employer, needs access to stored data.
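The pattern the report describes (one volume key, wrapped under multiple “slots”) looks roughly like this stdlib-only Python sketch, with toy XOR wrapping standing in for a real key-wrap algorithm such as AES-KW:

```python
# Stdlib-only sketch of the multi-slot escrow pattern: one volume key,
# wrapped once under a user-derived key and once under a retained recovery
# key. The XOR wrap is a toy stand-in for a real key-wrap such as AES-KW.
import hashlib
import secrets

def xor_wrap(kek: bytes, key: bytes) -> bytes:
    # Symmetric toy wrap: XOR against a hash of the key-encryption key.
    pad = hashlib.sha256(kek).digest()
    return bytes(a ^ b for a, b in zip(key, pad))

volume_key = secrets.token_bytes(32)

salt = secrets.token_bytes(16)
user_kek = hashlib.pbkdf2_hmac("sha256", b"correct horse battery", salt, 100_000)
escrow_kek = secrets.token_bytes(32)   # retained by the vendor or employer

slot_user = xor_wrap(user_kek, volume_key)
slot_escrow = xor_wrap(escrow_kek, volume_key)

# Either slot yields the same volume key, which is exactly the report's
# point: data a vendor can recover for the user, it can recover for law
# enforcement too.
assert xor_wrap(user_kek, slot_user) == volume_key
assert xor_wrap(escrow_kek, slot_escrow) == volume_key
```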

The report also claims law enforcement “had not kept pace” with the increase of digital evidence. It posits the problem is a lack of funding and training. Training is almost certainly a problem, but very few law enforcement agencies — especially those at the federal level — suffer for funding or expertise. This might be due to bad assumptions, where officials believed they would always have full access to device contents (minus occasional end user initiative on encryption). When it became clear they wouldn’t, they began to seek solutions to the problems. This put them a few steps behind. Then there are those, like Manhattan DA Cy Vance and FBI Director Chris Wray, who are putting law enforcement even further behind by pushing for legislation rather than focusing their efforts on keeping officers and agents well-supplied and well-trained.

While the report does suggest vendors and law enforcement work together to solve this access “problem,” the suggestions place the burden on vendors. One suggested fix is one-way information sharing where vendors make law enforcement aware of unpatched exploits, allowing the government (and anyone else who discovers it) to use these vulnerabilities to gain access to communications and data. It’s a horrible suggestion — one that puts vendors in the liability line of fire and encourages continued weakening of device and software security.

The report also points out the calls for harder nerding have been at least partially answered. The proposed solutions aren’t great. In fact, one of them (running lawful access keys and software update keys through the same pipeline) is terrible. But it’s not as though no one on the tech side is trying to come up with a solution.

Several individuals with backgrounds in security and systems have begun to explore possible technical mechanisms to provide government exceptional access. Three individuals presented their ideas to the committee.

• Ernie Brickell, former chief security architect, Intel Corporation, described ways that protected partitions, a security feature provided by future microprocessor architectures, could be used to provide law enforcement access to devices in their physical possession, provide remote access by law enforcement, or provide key escrowed cryptography for use by applications and nonescrowed cryptography for a set of “allowed” applications.

• Ray Ozzie, former chief technical officer and former chief software architect, Microsoft Corporation, argued that if a user trusts a vendor to update software, the user should be able to trust the vendor to manage keys that can provide exceptional access. He proposed that this extension of the trust model used for software updates could be used to provide government exceptional access to unlock mobile devices. Ozzie also provided the committee with materials describing how this approach could be extended to real-time communications such as messaging.

• Stefan Savage, professor of computer science and engineering, University of California, San Diego, described how phone unlock keys could be stored in hardware and made available via an internal hardware interface together with a “proof-of-effort” lock that together would require physical possession and a time delay before law enforcement could unlock a device.
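For a rough sense of what Savage’s “proof-of-effort” delay could look like, here is a toy sketch that enforces the delay with inherently sequential hashing. Real designs would enforce this in hardware, and every name and parameter below is hypothetical, not drawn from his actual proposal.

```python
# Toy illustration of a "proof-of-effort" time delay: the unlock key can
# only be recovered after unavoidable sequential work. All values are
# hypothetical; a real design would enforce the delay in hardware.
import hashlib

ITERATIONS = 50_000_000  # illustrative; a real design would target hours or days

def slow_derive(seed: bytes, iterations: int = ITERATIONS) -> bytes:
    """Derive a wrapping key by inherently sequential hashing.

    Each step depends on the previous output, so the work cannot be
    parallelized away: recovering the key requires the device's seed
    plus real elapsed time.
    """
    h = seed
    for _ in range(iterations):
        h = hashlib.sha256(h).digest()
    return h

def xor_wrap(key: bytes, pad: bytes) -> bytes:
    # Toy wrapping cipher (one-time XOR); fine here since the pad is used once.
    return bytes(a ^ b for a, b in zip(key, pad))

# Provisioning: the phone-unlock key is wrapped under the slowly derivable
# key; blob and seed sit behind an internal-only hardware interface.
device_seed = b"per-device secret on an internal header"
unlock_key = bytes(32)  # stand-in for the real unlock key
stored_blob = xor_wrap(unlock_key, slow_derive(device_seed))

# Law enforcement, with the device physically in hand, reads the seed and
# blob off the internal interface and then must burn the full delay:
recovered = xor_wrap(stored_blob, slow_derive(device_seed))
assert recovered == unlock_key
```

The point of the construction is that remote, bulk, or covert unlocking becomes impractical: each device costs physical possession plus wall-clock time.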

The report points out these are only suggestions and have yet to be rigorously examined by security professionals. But their existence belies the narrative pushed by the FBI in its search for a federal statutory mandate. There are experts trying to help. Unfortunately, every solution proposed is going to require a sacrifice in device security.

The problem is complex, if you choose to believe it’s a problem. It may be troublesome that law enforcement can’t access device contents as easily as it could five years ago, but it’s not the threat to public safety anti-encryption enthusiasts like Chris Wray and Cy Vance make it out to be. Encryption use has gone up while crime rates have remained steady or decreased. The emphasis on cellphones as the ultimate investigative goldmine is misplaced. Plenty of options remain, and law enforcement spent years solving crimes without one-stop access to communications and personal documents. An ancient discovery known as “fire” has put evidence out of reach for thousands of years, but no one’s asking the smart guys at Big Match to come up with a solution. Things are harder, but they’re not impossible. What is impossible is what Wray and others are asking for: compromised encryption that is still secure.

Filed Under: doj, encryption, fbi, going dark, responsible encryption

White Paper Points Out Just How Irresponsible 'Responsible Encryption' Is

from the a-hole-for-one-is-a-hole-for-all dept

In recent months, both Deputy Attorney General Rod Rosenstein and FBI Director Christopher Wray have been calling for holes in encryption that law enforcement can drive a warrant through. Neither has any idea how this can be accomplished, but both are reasonably sure tech companies can figure it out for them. And if some sort of key escrow makes encryption less secure than it is now, so be it. Whatever minimal gains in access law enforcement obtains will apparently make up for the damage done by key leaks or criminal exploitation of a deliberately weakened system.

Cryptography expert Riana Pfefferkorn has released a white paper [PDF] examining the feasibility of the vague requests made by Rosenstein and Wray. Their preferred term is “responsible encryption” — a term that allows them to step around landmines like “encryption backdoors” or “we’re making encryption worse for everyone!” Her paper shows “responsible encryption” is anything but. And, even if implemented, it will result in far less access (and far more nefarious exploitation) than Rosenstein and Wray think.

The first thing the paper does is try to pin down exactly what it is these two officials want — easier said than done because neither official has the technical chops to concisely describe their preferred solutions. Nor do they have any technical experts on board to help guide them to their envisioned solution. (The latter is easily explained by the fact that no expert on cryptography has ever promoted the idea that encryption can remain secure after drilling holes in it at the request of law enforcement.)

If you’re going to respond to a terrible idea like “responsible encryption,” you have to start somewhere. Pfefferkorn starts with an attempt to wrangle vague law enforcement official statements into a usable framework for a reality-based argument.

Rosenstein’s remarks focused more on data at rest than data in transit. For devices, he has not said whether his preferred legislation would cover a range of devices (such as laptop and desktop computers or Internet of Things-enabled appliances), or only smartphones, as in some recent state-level bills. His speeches also leave open whether his preferred legislation would include an exceptional-access mandate for data in transit. As some commentators have pointed out, his proposal is most coherent if read to be limited in scope to mobile device encryption and to exclude data in transit. This paper therefore makes the same assumption.

Wray, meanwhile, discussed both encrypted messaging and encrypted devices in his January 2018 speech. He mentioned “design[ing] devices that both provide data security and permit lawful access” and asked for “the ability to access the device once we’ve obtained a warrant.” Like Rosenstein, he did not specify whether his “responsible solution” would go beyond mobile devices. As to data in transit, he used a financial-sector messaging platform as a real-world example of what a “responsible solution” might look like. Similarly, though, he did not specify whether his “solution” would be restricted to only certain categories of data—for example, communications exchanged through messaging apps (e.g., iMessage, Signal, WhatsApp) but not web traffic (i.e., HTTPS). This paper assumes that Wray’s “solution” would, like Rosenstein’s, encompass encryption of mobile devices, and that it would also cover messaging apps, but not other forms of data in transit.

Either way, there’s no one-size-fits-all approach. This is somewhat ironic, given these officials’ resistance to other methods, like cellphone-cracking tools or approaching third parties for data and communications. According to the FBI (in particular), those solutions “don’t scale.” Well, neither does either of the approaches Rosenstein and Wray suggest, although Rosenstein’s decision to limit his arguments to data at rest on devices does make his somewhat more scalable.

The only concrete example given of how key escrow might work for end-to-end encrypted communications is the one noted above: a messaging platform used for bank communications. An agreement reached with the New York state government altered the operation of the banking industry’s “Symphony” messaging platform. Banks now retain encrypted communications for seven years and generate duplicate decryption keys, which are held by independent parties (neither the banks nor the government). But this analogy doesn’t apply as well as Wray thinks it does.

That agreement was with the banks about changing their use of the platform, not with the developer about changing its design of the platform, which makes it a somewhat inapt example for illustrating how developers should behave “responsibly” when it comes to encryption.

Applied directly, it would be akin to asking cellphone owners to store a copy of their decryption key with an independent party in case law enforcement ever needs access to the phone’s contents. Involve several communication platform providers and you’re generating several duplicate keys per user. What the analogy does not support is what Wray and Rosenstein actually suggest: decryption keys duplicated or created by manufacturers solely for the purpose of government access.
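As a rough sketch of the Symphony-style arrangement (not the platform’s actual design, whose details aren’t public here), depositing duplicate wrapped keys with independent custodians might look like the following, again using RSA-OAEP from the `cryptography` package. The custodian setup and names are assumptions.

```python
# Sketch of a Symphony-style custodial arrangement: each message key is
# wrapped for independent custodians (neither the bank nor the government).
# Custodian details are illustrative assumptions.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Two independent custodians, each holding their own key pair.
custodians = [rsa.generate_private_key(public_exponent=65537, key_size=3072)
              for _ in range(2)]

message_key = os.urandom(32)  # symmetric key protecting one conversation

# The platform deposits a wrapped copy with every custodian; a court order
# served on any one of them is enough to recover the key later.
deposits = [c.public_key().encrypt(message_key, OAEP) for c in custodians]

# Years later, custodian 0 honors a subpoena:
assert custodians[0].decrypt(deposits[0], OAEP) == message_key
```

Note who does the depositing here: the platform’s *users* (the banks), under their own agreement. Scaling that to every phone owner and every manufacturer is the part the analogy quietly skips.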

These officials think this solution scales. And it does. But scaling increases the possibility of the keys falling into the wrong hands, not to mention the increased abuse of law enforcement request portals by criminals to gain access to locked devices and accounts. As Pfefferkorn notes, these are problems Wray and Rosenstein have never addressed. Worse, they’ve never even admitted these problems exist.

What a quasi-escrow system would do is exponentially increase attack vectors for criminals and state-sponsored hacking. Implementing Rosenstein’s suggestion would provide ample opportunities for misuse.

Rosenstein suggests that manufacturers could manage the exceptional-access decryption key the same way they manage the key used to sign software updates. However, that analogy does not hold up. The software update key is used relatively infrequently, by a small number of trusted individuals. Law enforcement’s unlocking demands would be far more frequent. The FBI alone supposedly has been unable to unlock around 7,800 encrypted devices in the space of the last fiscal year. State and local law enforcement agencies, plus those in other countries, up the tally further. There are thousands of local police departments in the United States, the largest of which already amass hundreds of locked smartphones in a year.
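For contrast, here is a minimal sketch of the software-update trust model the analogy leans on, using Ed25519 signatures as a stand-in for whatever scheme a given vendor actually uses. The key property is on the last line: devices only ever hold the public half, and the private key is exercised a handful of times a year by a handful of people, not thousands of times a year per police department.

```python
# Minimal sketch of software-update signing (Ed25519 is an assumption;
# real vendors' schemes differ). Contrast with an exceptional-access key,
# which would be exercised constantly by many more people.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Kept offline or in an HSM; touched only when a release ships.
vendor_signing_key = Ed25519PrivateKey.generate()
verify_key = vendor_signing_key.public_key()  # baked into every device

update_image = b"firmware v42.1 payload..."
signature = vendor_signing_key.sign(update_image)

# Device side: verification needs only the public key, so the secret never
# leaves the vendor. Raises InvalidSignature if the image was tampered with.
verify_key.verify(signature, update_image)
print("update accepted")
```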

Wray’s suggestion isn’t any better. In fact, it’s worse. His proposal (what there is of it) suggests it won’t just be phone manufacturers providing key escrow but also any developer offering end-to-end encrypted communications. This vastly increases the number of key sources. In both cases, developers and manufacturers would need to take on more staff to handle law enforcement requests. This increases the number of people with access to keys, increasing the chances they’ll be leaked, misused, sold, or stolen.

The large number of law enforcement requests headed to key holders poses more problems. Bogus requests are going to start making their way into the request stream, potentially handing access to criminals or other bad actors. While this can be mitigated with hardware storage, the attack vectors remain open.

[A]n attacker could still subvert the controls around the key in order to submit encrypted data to the HSM [hardware security module] for decryption. This is tantamount to having possession of the key itself, without any need to attack the tamper-resistant HSM directly. One way for an attacker to get an HSM to apply the key to its encrypted data input is to make the attacker’s request appear legitimate by subverting the authentication process for exceptional-access demands.
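A toy model of that attack path, with entirely hypothetical names, might look like this: the escrow key never leaves the “HSM,” yet anything that clears the authorization check gets decrypted anyway, so the check itself becomes the target.

```python
# Toy model of the weakness Pfefferkorn describes: the key material is
# protected by hardware, but its *use* is gated by a software-level
# authorization check. Everything here is hypothetical.
import os
from hashlib import sha256

ESCROW_KEY = os.urandom(32)                  # in reality, never leaves the HSM
AUTHORIZED_WARRANTS = {"warrant-2018-0042"}  # the registry an attacker subverts

def _keystream(length: int) -> bytes:
    # Toy stand-in for the HSM's internal cipher (not for real use).
    out, counter = b"", 0
    while len(out) < length:
        out += sha256(ESCROW_KEY + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def hsm_apply_key(blob: bytes, warrant_id: str) -> bytes:
    """Apply the escrow key inside the 'HSM' -- but only this check
    stands between an attacker and the key's full power."""
    if warrant_id not in AUTHORIZED_WARRANTS:
        raise PermissionError("exceptional-access request not authorized")
    return bytes(a ^ b for a, b in zip(blob, _keystream(len(blob))))

# A forged or replayed warrant ID recovers plaintext without ever touching
# the tamper-resistant hardware (XOR is symmetric, so apply = encrypt/decrypt).
ciphertext = hsm_apply_key(b"suspect's device key", "warrant-2018-0042")
assert hsm_apply_key(ciphertext, "warrant-2018-0042") == b"suspect's device key"
```

In other words, the tamper-resistance of the hardware never comes into play; subverting the authentication process is, as the paper puts it, tantamount to possessing the key itself.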

These are just the problems a key escrow system would create on the supply side. The demand for robust encryption won’t go away. Criminals and non-criminals alike will seek out truly secure platforms and products, taking their business to vendors beyond the US government’s reach. At best, forced escrow will be a short-term solution with a whole lot of collateral damage attached. Domestic businesses will lose sales, and others will be harmed as the deliberately introduced holes in encryption allow attackers to exfiltrate intellectual property and trade secrets, conduct industrial espionage, and engage in identity theft.

Wray and Rosenstein tout “responsible encryption,” but their arguments are completely irresponsible. Neither has fully acknowledged how much collateral damage their demands would cause, and both have suggested the damage is acceptable even if law enforcement gains only minimal additional access. They’ve also made it clear that every negative consequence will be borne by device and service providers, from the additional costs of compliance to the sales lost to competitors still offering uncompromised encryption. There’s nothing “responsible” about their actions or their public statements, but both believe they’re 100% on the right side of the argument. They aren’t, and they’ve made it clear the wants and needs of US citizens will always be secondary to the wants and needs of law enforcement.

Filed Under: doj, encryption, going dark, responsible encryption, rod rosenstein