rod rosenstein – Techdirt

Sharyl Attkisson Lawsuit Against Rod Rosenstein Claiming She Was Hacked By Government Tossed

from the crazypants dept

Remember Sharyl Attkisson? If not, she is a former CNN and CBS journalist who made something of a name for herself both for her often-critical reporting on the Obama administration and for accusing that same administration of hacking into her computer and home network. Whatever you think of her reporting, her lawsuit against Eric Holder and the Justice Department over the hacking claims was crazy-pants. Essentially, she took a bunch of the same technological glitches all of us deal with on a daily basis — flickering television screens, a stuck backspace key on her computer — and wove them into a giant conspiracy against her and her reporting. She made a big deal in the suit, and in her subsequent book on the matter, of the “computer experts” she relied on to confirm that she was a victim of government hacking, except those experts remained largely anonymous and were even, in some cases, third parties she’d never met. For that and other reasons related to her handling of initial discovery, the case was tossed by the courts in 2019.

That didn’t stop Attkisson’s crusade against the government, however. In 2020, she filed suit against Rod Rosenstein, again accusing the government of spying on her and her family. To back this up, she again relied on an anonymous source, but that source has since been revealed. And, well…

The source was initially anonymous but later identified by Attkisson’s attorneys as Ryan White, an alleged former FBI informant. White is a QAnon conspiracy adherent who appears to have been the source of bizarre child-abuse allegations that Georgia attorney Lin Wood leveled at Chief Justice John Roberts last year, according to a report in the Daily Beast.

And so here we are yet again, with an extremely serious claim lodged against the federal government that relies on the tinfoil hat crowd as “evidence.” In addition, Attkisson again lays out the computer and network hacking claims, with a named “computer forensics” expert who apparently told her that there was spyware on her machine, that they had logs showing where these breaches originated (such as a Ritz Carlton hotel), and that the tools used for all of this appeared to be the sort typically only available to government actors. And here too, just as in her original lawsuit, there are tons of details and claims that reveal that, like so many other conspiracy theories, there is a duality problem: namely, that the federal government is so nefarious and great at hacking that it completely compromised nearly every machine Attkisson used at work and at home, but that same federal government was too stupid to mask the IP address from which it launched these attacks.

For example, her suit claims that these attacks were originally launched from the United States Postal Service in Baltimore, where some staff involved in infiltrating The Silk Road worked. The contention of her QAnon witness is that the spying on Attkisson somehow happened as an offshoot of a multi-agency task force against dark web dealings. And to believe all of that, you again have to believe that the government’s l337 h4x0rs didn’t bother to cover their USPS tracks.

But those are conversations about the merits of Attkisson’s case. We don’t really need to get that far, because her suit has again been tossed on essentially procedural grounds.

Bennett, an appointee of President George W. Bush, also ruled that there was inadequate indication that any surveillance of Attkisson involved activities in Maryland, which Bennett’s court has jurisdiction over.

“The Amended Complaint is devoid of any factual allegations with respect to actual conduct related to the alleged surveillance which occurred in Maryland,” Bennett wrote in his 20-page decision, issued on Tuesday. “The conclusory statements that the alleged surveillance was performed by individuals in Maryland, unsupported by any factual allegations, lie in contrast to the Plaintiffs’ numerous assertions regarding conduct performed and events which occurred in the Eastern District of Virginia.”

So, on the one hand, it’s not as if the court is saying that Attkisson’s claims are nonsense. And maybe this will lead to her refiling her lawsuit in the proper jurisdiction. On the other hand, it doesn’t inspire a great deal of confidence in the merits of her claims or her legal team that they can’t even get the case filed in the correct jurisdiction.

So, do I think this is the last we’ll hear from Sharyl Attkisson’s lawsuits over the supposed hacking of all her things? No, I doubt it. After all, she must certainly have another book to write and promote soon.

Filed Under: conspiracy theories, doj, rod rosenstein, sharyl attkisson, surveillance, usps

Deputy AG Claims There's No Market For Better Security While Complaining About Encryption At A Cybercrime Conference

from the an-actual-thing-that-happened dept

The FBI still hasn’t updated its bogus “uncrackable phones” total yet, but that isn’t stopping the DOJ from continuing its push for holes in encryption. Deputy AG Rod Rosenstein visited Georgetown University to give a keynote speech at its Cybercrime 2020 Conference. In it, Rosenstein again expressed his belief that tech companies are to blame for the exaggerated woes of law enforcement.

Pedophiles teach each other how to evade detection on darknet message boards. Gangs plan murders using social media apps. And extortionists deliver their demands via email. So, it is important for those of us in law enforcement to raise the alarm and put the public on notice about technological barriers to obtaining electronic evidence.

One example of such a barrier is “warrant-proof” encryption, where tech companies design their products or services in such a way that they claim it is impossible for them to assist in the execution of a court-authorized warrant. These barriers are having a dramatic impact on our cases, to the significant detriment of public safety. Technology makers share a duty to comply with the law and to support public safety, not just user privacy.

Rosenstein says this has resulted in a “significant detriment [to] public safety,” but can’t point to any data or evidence to back that claim up. The FBI’s count of devices it can’t access is off by at least a few thousand devices, by most people’s estimates. In terms of this number alone, the “public safety” problem is, at best, only half as bad as the DOJ has led us to believe.

Going beyond that, crime rates remain at historic lows in most places in the country, strongly suggesting no crime wave has been touched off by the advent of default encryption. Law enforcement agencies aren’t complaining about cases they haven’t cleared — if you exclude encryption alarmist/Manhattan DA Cyrus Vance. (Anyone hoping to have an honest conversation about encryption certainly should exclude him.)

Somehow, Rosenstein believes the public would experience a net safety gain by making their devices and personal info more easily accessed by criminals. Holes in encryption can be marked “law enforcement only,” much like private property owners can hang “no trespassing” signs. But neither is actually a deterrent to determined criminals.

Rosenstein goes on to tout “responsible encryption” — a fairy tale he created that revolves around the premise tech companies can break/unbreak encryption at the drop of a warrant. But broken encryption can’t be unbroken, not even with some form of key escrow. The attack vector may change, but it still exists.
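The problem with key escrow can be seen in even a toy sketch: once a duplicate of a key exists anywhere, the decryption capability travels with it. The Python below is purely illustrative (a throwaway XOR keystream, not real cryptography, and every name is hypothetical):

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream derived from the key (illustration only, not a real cipher).
    return hashlib.shake_256(key).digest(n)

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

# Without escrow: one key, held only by the device owner.
owner_key = secrets.token_bytes(32)
msg = b"meet at noon"
ct = encrypt(owner_key, msg)

# "Key escrow" is nothing more than a second copy of the same key.
escrow_copy = owner_key

# Whoever holds (or steals) the escrow copy reads every ciphertext;
# the math cannot distinguish lawful access from unlawful access.
assert decrypt(escrow_copy, ct) == msg
```

The escrow copy is mathematically indistinguishable from the owner's key, which is why handing it to a "trusted third party" relocates the attack vector rather than eliminating it.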

That Rosenstein is advocating inferior encryption during a cybercrime conference speaks volumes about what the DOJ actually considers to be worth protecting. It’s not businesses and their customers. It’s law enforcement’s access. He spends half the run time talking about security breaches involving tech companies and follows it up by suggesting they take less care securing all this info they collect.

He even goes so far as to claim better security is something customers don’t want and is bad for tech companies’ bottom lines.

Building secure devices requires additional testing and validation — which slows production times — and costs more money. Moreover, enhanced security can sometimes result in less user-friendly products. It is inconvenient to type your zip code when you use a credit card at the gas station, or type a password into your smartphone.

Creating more secure devices risks building a product that will be later to market, costlier, and harder to use. That is a fundamental misalignment of economic incentives and security.

The implicit statement Rosenstein’s making is that ramped-up security — including default encryption — is nothing more than companies screwing shareholders just so they can stick it to The Man. Following this bizarre line of thought is to buy into Rosenstein’s conspiracy theory: one that views tech companies as a powerful cabal capable of rendering US law enforcement impotent.

And as much as Rosenstein hammers tech companies for security breaches that have exposed the wealth of personal data they collect, he ignores the question his encryption backdoor/side door advocacy raises. This question was posed in an excellent post by Cathy Gellis at the beginning of this year:

“What is a company to do if it suffers a data breach and the only thing compromised is the encryption key it was holding onto?”

We’re headed into 2019 and no one in the DOJ or FBI is willing to honestly discuss the side effects of their proposals. Rosenstein clings to his “responsible encryption” myth and the director of the FBI wants to do nothing more than make it the problem of “smart people” at tech companies he’s seeking to bend to his will. No one in the government wants to take responsibility for the adverse outcomes of weakened encryption, but they’re more than willing to blame everyone else any time their access to evidence seems threatened.

Rosenstein’s unwavering stance on the issue makes this statement, made at the closing of his remarks, ring super-hollow.

We should not let ideology or dogma stand in the way of constructive academic engagement.

Fair enough, Rod. You go first.

Filed Under: backdoors, blame, doj, encryption, fbi, going dark, responsible encryption, rod rosenstein

White Paper Points Out Just How Irresponsible 'Responsible Encryption' Is

from the a-hole-for-one-is-a-hole-for-all dept

In recent months, both Deputy Attorney General Rod Rosenstein and FBI Director Christopher Wray have been calling for holes in encryption law enforcement can drive a warrant through. Neither has any idea how this can be accomplished, but both are reasonably sure tech companies can figure it out for them. And if some sort of key escrow makes encryption less secure than it is now, so be it. Whatever minimal gains in access law enforcement obtains will apparently offset the damage done by key leaks or criminal exploitation of a deliberately weakened system.

Cryptography expert Riana Pfefferkorn has released a white paper [PDF] examining the feasibility of the vague requests made by Rosenstein and Wray. Their preferred term is “responsible encryption” — a term that allows them to step around landmines like “encryption backdoors” or “we’re making encryption worse for everyone!” Her paper shows “responsible encryption” is anything but. And, even if implemented, it will result in far less access (and far more nefarious exploitation) than Rosenstein and Wray think.

The first thing the paper does is try to pin down exactly what it is these two officials want — easier said than done because neither official has the technical chops to concisely describe their preferred solutions. Nor do they have any technical experts on board to help guide them to their envisioned solution. (The latter is easily explained by the fact that no expert on cryptography has ever promoted the idea that encryption can remain secure after drilling holes in it at the request of law enforcement.)

If you’re going to respond to a terrible idea like “responsible encryption,” you have to start somewhere. Pfefferkorn starts with an attempt to wrangle vague law enforcement official statements into a usable framework for a reality-based argument.

Rosenstein’s remarks focused more on data at rest than data in transit. For devices, he has not said whether his preferred legislation would cover a range of devices (such as laptop and desktop computers or Internet of Things-enabled appliances), or only smartphones, as in some recent state-level bills. His speeches also leave open whether his preferred legislation would include an exceptional-access mandate for data in transit. As some commentators have pointed out, his proposal is most coherent if read to be limited in scope to mobile device encryption and to exclude data in transit. This paper therefore makes the same assumption.

Wray, meanwhile, discussed both encrypted messaging and encrypted devices in his January 2018 speech. He mentioned “design[ing] devices that both provide data security and permit lawful access” and asked for “the ability to access the device once we’ve obtained a warrant.” Like Rosenstein, he did not specify whether his “responsible solution” would go beyond mobile devices. As to data in transit, he used a financial-sector messaging platform as a real-world example of what a “responsible solution” might look like. Similarly, though, he did not specify whether his “solution” would be restricted to only certain categories of data—for example, communications exchanged through messaging apps (e.g., iMessage, Signal, WhatsApp) but not web traffic (i.e., HTTPS). This paper assumes that Wray’s “solution” would, like Rosenstein’s, encompass encryption of mobile devices, and that it would also cover messaging apps, but not other forms of data in transit.

Either way, there’s no one-size-fits-all approach. This is somewhat ironic given these officials’ resistance to using other methods, like cellphone-cracking tools or approaching third parties for data and communications. According to the FBI (in particular), these solutions “don’t scale.” Well, neither do the approaches suggested by Rosenstein and Wray, although Rosenstein’s decision to limit his arguments to data at rest on devices does suggest a somewhat more scalable approach.

The only concrete example given of how key escrow might work to access end-to-end encrypted communications is noted above: a messaging platform used for bank communications. An agreement reached with the New York state government altered the operation of the banking industry’s “Symphony” messaging platform. Banks now hold encrypted communications for seven years and generate duplicate decryption keys that are held by independent parties (neither the banks nor the government). But this analogy doesn’t apply as well as FBI Director Christopher Wray thinks it does.

That agreement was with the banks about changing their use of the platform, not with the developer about changing its design of the platform, which makes it a somewhat inapt example for illustrating how developers should behave “responsibly” when it comes to encryption.

Applied directly, it would be akin to asking cellphone owners to store a copy of their decryption key with an independent party in case law enforcement needed access to the contents of their phones. If several communication platform providers are involved, several duplicate keys must be generated. What this analogy does not support is what Wray and Rosenstein actually suggest: the duplication or development of decryption keys by manufacturers solely for the purpose of government access.

These officials think this solution scales. And it does. But scaling increases the possibility of the keys falling into the wrong hands, not to mention the increased abuse of law enforcement request portals by criminals to gain access to locked devices and accounts. As Pfefferkorn notes, these are problems Wray and Rosenstein have never addressed. Worse, they’ve never even admitted these problems exist.

What a quasi-escrow system would do is exponentially increase attack vectors for criminals and state-sponsored hacking. Implementing Rosenstein’s suggestion would provide ample opportunities for misuse.

Rosenstein suggests that manufacturers could manage the exceptional-access decryption key the same way they manage the key used to sign software updates. However, that analogy does not hold up. The software update key is used relatively infrequently, by a small number of trusted individuals. Law enforcement’s unlocking demands would be far more frequent. The FBI alone supposedly has been unable to unlock around 7,800 encrypted devices in the space of the last fiscal year. State and local law enforcement agencies, plus those in other countries, up the tally further. There are thousands of local police departments in the United States, the largest of which already amass hundreds of locked smartphones in a year.

Wray’s suggestion isn’t any better. In fact, it’s worse. His proposal (what there is of it) suggests it won’t just be phone manufacturers providing key escrow but also any developer offering end-to-end encrypted communications. This vastly increases the number of key sources. In both cases, developers and manufacturers would need to take on more staff to handle law enforcement requests. This increases the number of people with access to keys, increasing the chances they’ll be leaked, misused, sold, or stolen.

The large number of law enforcement requests headed to key holders poses more problems. Bogus requests are going to start making their way into the request stream, potentially handing access to criminals or other bad actors. While this can be mitigated with hardware storage, the attack vectors remain open.

[A]n attacker could still subvert the controls around the key in order to submit encrypted data to the HSM [hardware security module] for decryption. This is tantamount to having possession of the key itself, without any need to attack the tamper-resistant HSM directly. One way for an attacker to get an HSM to apply the key to its encrypted data input is to make the attacker’s request appear legitimate by subverting the authentication process for exceptional-access demands.
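Pfefferkorn's point can be made concrete with a toy sketch: even if the escrowed key lives inside a tamper-resistant HSM, an attacker who subverts the request-authentication layer gets decryptions on demand. The Python stub below is purely illustrative (a throwaway XOR keystream stands in for the real cipher, and all names are hypothetical):

```python
import hashlib
import hmac
import secrets

# Hypothetical HSM stub. The escrowed key never leaves the module, but the
# module happily decrypts for any request whose authentication tag verifies.
ESCROW_KEY = secrets.token_bytes(32)
PORTAL_SECRET = secrets.token_bytes(32)  # authenticates "lawful" requests

def hsm_decrypt(request: bytes, tag: bytes, ciphertext: bytes) -> bytes:
    expected = hmac.new(PORTAL_SECRET, request, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise PermissionError("unauthenticated exceptional-access request")
    # Toy XOR "decryption" stands in for the real cipher.
    ks = hashlib.shake_256(ESCROW_KEY).digest(len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))

# An attacker who steals PORTAL_SECRET (or games the approval workflow)
# never touches the HSM's key material, yet decrypts at will. That is
# tantamount to possessing the key itself.
```

The tamper-resistance of the hardware is beside the point: the weakest link becomes whatever process decides which requests look legitimate.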

These are just the problems a key escrow system would produce on the supply side. The demand for robust encryption won’t go away. Criminals and non-criminals alike will seek out truly secure platforms and products, taking their business to vendors out of the US government’s reach. At best, forced escrow will be a short-term solution with a whole bunch of collateral damage attached. Domestic businesses will lose sales, and other businesses will be harmed as deliberately introduced holes in encryption allow attackers to exfiltrate intellectual property and trade secrets, conduct industrial espionage, and engage in identity theft.

Wray and Rosenstein tout “responsible encryption.” But their arguments are completely irresponsible. Neither has fully acknowledged how much collateral damage would result from their demands. They’ve both suggested the damage is acceptable even if there is only a minimal gain in law enforcement access. And they’ve both made it clear every negative consequence will be borne by device and service providers — from the additional costs of compliance to the sales lost to competitors still offering uncompromised encryption. There’s nothing “responsible” about their actions or their public statements, but they both believe they’re 100% on the right side of the argument. They aren’t and they’ve made it clear the wants and needs of US citizens will always be secondary to the wants and needs of law enforcement.

Filed Under: doj, encryption, going dark, responsible encryption, rod rosenstein

My Question To Deputy Attorney General Rod Rosenstein On Encryption Backdoors

from the golden-key-and-databreach dept

Never mind all the other reasons Deputy Attorney General Rod Rosenstein’s name has been in the news lately… this post is about his comments at the State of the Net conference in DC on Monday. In particular: his comments on encryption backdoors.

As he and so many other government officials have before, he continued to press for encryption backdoors, as if it were possible to have a backdoor and a functioning encryption system. He allowed that the government would not itself need to have the backdoor key; it could simply be a company holding onto it, he said, as if this qualification would lay all concerns to rest.

But it does not, and so near the end of his talk I asked the question, “What is a company to do if it suffers a data breach and the only thing compromised is the encryption key it was holding onto?”

There were several concerns reflected in this question. One relates to what the poor company is to do. It’s bad enough when a company experiences a data breach and user information is compromised. Not only does a data breach undermine a company’s relationship with its users, but, recognizing how serious this problem is, authorities are increasingly developing policies instructing companies on how to respond to such a situation, and failing to comport with these requirements can expose the company to significant legal liability.

But if an encryption key is taken, far more is at risk than basic user information, financial details, or even the pool of potentially rich and varied data related to the user’s interactions with the company. Rather, every single bit of information the user has ever depended on the encryption system to secure stands to be compromised. What is the appropriate response of a company whose data breach has now stripped its users of all the protection they depended on for all this data? How can it even begin to mitigate the resulting harm? Just what would the government officials who required the company to keep this backdoor key now propose it do? If the government is going to force companies into the position of holding onto these keys, companies will need these answers if they are going to be able to afford to be in the encryption business at all.

Which leads to the other idea I was hoping the question would capture: that encryption policy and cybersecurity policy are not two distinct subjects. They interrelate. So when government officials worry about what bad actors do, as Rosenstein’s comments reflected, it can’t lead to the reflexive demand that encryption be weakened simply because, as they reason, bad actors use encryption. Not when the same officials are also worried about bad actors breaching systems, because this sort of weakened encryption so significantly raises the cost of these breaches (as well as potentially makes them easier).

Unfortunately, Rosenstein had no good answer. There was lots of equivocation, punctuated with the assertion that experts had assured him it was feasible to create backdoors and keep them safe. Time ran out before anyone could ask the follow-up question of exactly who these mysterious experts were, especially in light of so many other experts agreeing that such a solution is not possible, but perhaps that answer is something Senator Wyden can find out.

Filed Under: cybersecurity, encryption backdoors, going dark, responsible encryption, rod rosenstein

Latest DOJ WTFness: Encryption Is Like A Locked House That Won't Let Its Owners Back Inside

from the spare-the-Rod,-spoil-the-horse-carcass dept

Deputy Attorney General Rod Rosenstein continues his push for law enforcement-friendly broken encryption. The ultimate goal is the same but the arguments just keep getting worse. Trying to pitch worthless encryption (i.e., encryption easily compromised in response to government demands) as “responsible” encryption is only the beginning of Rosenstein’s logical fallacies.

After a month-plus of bad analogies and false equivalences, Rosenstein has managed to top himself. The path to Rosenstein’s slaughtering of a metaphor runs through such highlights as the DAG claiming device encryption is solely motivated by profits and that this is the first time in history law enforcement hasn’t had access to all forms of evidence. It’s an intellectually dishonest campaign against encryption, propelled by the incredibly incorrect belief that the Fourth Amendment was written to provide the government with access, rather than to protect citizens from their government.

In a long article by Cyrus Farivar discussing a recent interview given by Rosenstein, the Deputy Attorney General drops this abomination of an analogy:

“I favor strong encryption, because the stronger the encryption, the more secure data is against criminals who are trying to commit fraud,” he explained. “And I’m in favor of that, because that means less business for us prosecuting cases of people who have stolen data and hacked into computer networks and done all sorts of damage. So I’m in favor of strong encryption.”

“This is, obviously, a related issue, but it’s distinct, which is, what about cases where people are using electronic media to commit crimes? Having access to those devices is going to be critical to have evidence that we can present in court to prove the crime. I understand why some people merge the issues. I understand that they’re related. But I think logically, we have to look at these differently. People want to secure their houses, but they still need to get in and out. Same issue here.”

It is nowhere near the “same issue.” I sincerely hope DAG Rosenstein regrets every word of this statement.

Let’s streamline the analogy: People want to protect the data on their phones. People still want to be able to access this data on their phones. In no case ever has encryption prevented people from accessing the data on their phones. Forgotten passcodes might, but that’s like losing house keys. You might need outside assistance to get back in.

Rosenstein’s analogy skips a step. It has to. There’s no way this analogy can ever work when couched in Rosenstein’s anti-encryption statements. People lock their houses when they leave and unlock them with their keys when they get back. Rosenstein’s analogy is completely baffling, given the context of his remarks. How does strong security prevent people from “entering” their devices? It doesn’t, and Rosenstein knows this. It only prevents people other than the device owner from doing so.
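The step the analogy skips is easy to sketch: device encryption keys are (roughly speaking) derived from the owner's passcode, so the owner can always "get back in"; only someone without the passcode is locked out. A toy Python illustration (real devices also bind in hardware-backed secrets, and all names here are hypothetical):

```python
import hashlib
import secrets

# The owner's "house key" is derived from their passcode, so encryption
# never locks the owner out; it only locks out everyone else.
# (Toy derivation; real devices also mix in hardware-bound secrets.)
salt = secrets.token_bytes(16)

def unlock_key(passcode: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

owner = unlock_key("4921")            # the owner's passcode opens the door
assert owner == unlock_key("4921")    # ...every time they come home
assert owner != unlock_key("0000")    # a wrong guess does not
```

The lock never refuses the homeowner; what Rosenstein actually wants is a spare key for the government, which is a different house entirely.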

What he’s actually talking about is government access, but he can’t find a credible argument for weakening the strong encryption he just claimed he believed in. And he doesn’t have the intellectual honesty to say what he really means. The “they” in “but they still need to get in and out” is meant to encompass law enforcement agencies. In the context of Rosenstein’s anti-encryption argument, that’s the only interpretation that makes any sort of sense. Otherwise, it’s a non sequitur — one that claims strong security is somehow capable of preventing home owners from coming and going as they please.

A boneheaded analogy like this is the only rhetorical option left. That’s because what Rosenstein wants — easily-compromised “strong” encryption (i.e., “responsible encryption”) — simply cannot exist. Impossible demands can only be justified by implausible arguments. Given the swift and steady deterioration of Rosenstein’s rhetoric, it’s probably time to put his “Dead Horses and the Men Who Beat Them” show on ice.

Filed Under: bad analogies, doj, encryption, going dark, responsible encryption, rod rosenstein

DOJ: Civil Asset Forfeiture Is A Good Thing That Only Harms All Those Criminals We Never Arrest

from the nothing-good-to-say-but-all-the-space-in-the-world-to-say-it dept

Deputy Attorney General Rod Rosenstein has taken a brief vacation from his “Responsible Encryption World Tour” to defend the merits of something equally questionable: civil asset forfeiture. [h/t Meaghan Ybos]

As is the case with any article defending the practice of taking “guilty” stuff from people without even bothering to determine whether the people were actually guilty of anything, Rosenstein’s WSJ editorial glosses over the thousands of abuses to home in on a high-profile case: the prosecution of Bernie Madoff.

Thanks to civil asset forfeiture, the Department of Justice is announcing today the record-setting distribution of restitution to victims of Bernard Madoff’s notorious investment fraud scheme. We have recovered $3.9 billion from third parties—not Mr. Madoff—and are now returning that money to more than 35,000 victims. This is the largest restoration of forfeited property in history. Civil forfeiture has allowed the government to seize those illicit proceeds and return them to Mr. Madoff’s victims.

To be clear, assets taken from Madoff were seized via criminal asset forfeiture, which requires a conviction. Rosenstein’s decision to open with this case glosses over the difference, allowing the reader to think civil and criminal asset forfeiture are barely distinct entities. His op-ed doesn’t actually say how much of that $3.9 billion came from civil asset forfeiture — a process that has nothing to do with a criminal prosecution like Madoff’s.

From there, Rosenstein says the expected stuff: civil asset forfeiture is just a way of crippling criminal enterprises, despite it being predicated on one-sided accusations about the allegedly illegitimate origin of seized property and tied to a judicial process that discourages citizens from attempting to reclaim their possessions.

The opening paragraph also makes it appear as though civil asset forfeiture is often used to return unlawfully obtained assets to victims of crime. Nothing could be further from the truth. While this occasionally happens in criminal forfeiture cases, the lack of criminal charges in civil forfeiture cases makes it extremely unlikely there will be any “victims” to “return” seized assets to.

In most cases, the agency performing the seizure is allowed to directly benefit from it. Whether it’s used to pay for new equipment or offset investigatory expenses, seized property rarely ends up back in the hands of victims.

But you won’t be hearing any of that in Rosenstein’s pro-forfeiture pep talk. Instead, he presents civil forfeiture as a skillfully-wielded scalpel, rather than the property-grabbing cudgel it actually is.

Some critics claim that civil asset forfeiture fails to protect property rights or provide due process. The truth is that there are multiple levels of judicial protection, as well as administrative safeguards.

First, money or property cannot be seized without a lawful reason. The evidence must be sufficient to establish probable cause to believe a crime was committed. That is the same standard needed to justify an arrest.

Second, if anyone claims ownership of the property, it may be forfeited only if the government presents enough evidence in court to establish by a preponderance of the evidence it was the proceeds of crime, or was used to commit a crime.

Courts apply the “beyond a reasonable doubt” standard only in criminal cases. That high threshold of proof is appropriate when the stakes involve a person’s criminal record and potential imprisonment. But all other lawsuits, no matter how much money is at issue, use the normal civil standard. There is no logical reason to demand the elevated criminal standard in a lawsuit about illicit proceeds.

First, the money can be seized for any reason, with justification supplied after the fact. Stating law enforcement needs “probable cause” to seize property is simply untrue. Rosenstein knows this because he points out the standard of evidence needed to secure the forfeiture is actually lower than the standard needed to secure a warrant: “preponderance of evidence.” If probable cause were actually needed, drivers and travelers wouldn’t have to worry nearly as much about having their cash seized by highway patrol officers during traffic stops or by DEA agents while passing through airport security. Pretextual stops and scanning passenger manifests for one-way ticket purchases are no one’s idea of “probable cause.”

Furthermore, if the standard of evidence needed prior to seizure were actually the same as the requirement to secure an arrest warrant, more seizure victims would be arrested. But they’re not. They’re usually free to go, minus whatever law enforcement officers have taken from them.

As for the last part, Rosenstein is right: we shouldn’t need to change the standard of evidence in civil cases. But that’s not where the change is needed. If property is being taken from criminals — as Rosenstein and other forfeiture supporters claim — then all seizures should be of the criminal variety: a conviction should be required. This leaves civil lawsuit evidence requirements unchanged… just the way Rosenstein prefers it.

Then there’s this, which Rosenstein offers up as some sort of proof that the government is in the right at least 80% of the time when it takes property from citizens without charging them with crimes:

About 80% of the time, nobody even tries to claim the seized assets.

Well, let’s look at this. Rosenstein talks billions in his Madoff anecdote, but the reality of civil asset forfeiture is that a majority of seizures fall well under the $1,000 mark. Considering the long, uphill battle facing forfeiture victims, anything short of several thousand dollars usually isn’t worth the effort. In those cases, the expense of challenging the forfeiture would outweigh the value of the property recovered. This is a stupid stat that proves nothing.

The years of documentation of widespread forfeiture abuse by law enforcement agencies? It’s reduced to this by the Deputy Attorney General:

To be sure, law-enforcement officers sometimes make mistakes.

Come on, Rod. This is just embarrassing. You want the private sector to trust you and get on board with DOJ encryption key escrow, etc.? Maybe stop lying to the public. Maybe discontinue this gross minimization of repeated, abusive law enforcement behavior. Maybe do something more to curb forfeiture abuse. Hell, try doing anything at all. The only thing the DOJ has done in recent months is open back up the federal forfeiture adoption program — something that has been abused for years by law enforcement agencies looking to route around restrictive state laws.

It’s unsurprising the DOJ would argue publicly that civil asset forfeiture is A-OK and good for America. It’s just unsettling that the arguments are this bad.

Filed Under: bernie madoff, civil asset forfeiture, crimes, doj, rod rosenstein

Back Down The Rabbit Hole About Encryption On Smartphones

from the the-rule-of-law dept

Deputy Attorney General Rod Rosenstein wrote the disapproving memo that President Trump used as a pretext to fire FBI Director James Comey in May. But on at least one area of law-enforcement policy, Rosenstein and Comey remain on the same page—the Deputy AG set out earlier this month to revive the outgoing FBI director’s efforts to limit encryption and other digital security technologies. In doing so, Rosenstein has drawn upon nearly a quarter century of the FBI’s anti-encryption tradition. But it’s a bad tradition.

Like many career prosecutors, Deputy Attorney General Rod Rosenstein is pretty sure he’s more committed to upholding the U.S. Constitution and the rule of law than most of the rest of us are. This was the thrust of Rosenstein’s October 10 remarks on encryption, delivered to an audience of midshipmen at the U.S. Naval Academy.

The most troubling aspect of Rosenstein’s speech was his insistence that, while the government’s purposes in defeating encryption are inherently noble, the motives of companies that provide routine encryption and other digital-security tools (the way Apple, Google and other successful companies now do) are inherently selfish and greedy.

At the same time, Rosenstein said those who disagree with him on encryption policy as a matter of principle—based on decades of grappling with the public-policy implications of using strong encryption versus weak encryption or no encryption—are “advocates of absolute privacy.” (We all know that absolutism isn’t good, right?)

In his address, Rosenstein implied that federal prosecutors are devoted to the U.S. Constitution in the same way that Naval Academy students are:

“Each Midshipman swears to ‘support and defend the Constitution of the United States against all enemies, foreign and domestic.’ Our federal prosecutors take the same oath.”

Of course, he elides the fact that many who differ with his views on encryption—including yours truly, as a lawyer licensed in three jurisdictions—have also sworn, multiple times, to uphold the U.S. Constitution. What’s more, many of the constitutional rights we now regard as sacrosanct, like the Fifth Amendment privilege against self-incrimination, were only vindicated over time under our rule of law—frequently in the face of overreaching by law-enforcement personnel and federal prosecutors, all of whom also swore to uphold the Constitution.

The differing sides of the encryption policy debate can’t be reduced to supporting or opposing the rule of law and the Constitution. But Rosenstein chooses to characterize the debate this way because, as someone whose generally admirable career has been spent entirely within government, and almost entirely within the U.S. Justice Department, he has simply never attempted to put himself in the position of those with whom he disagrees.

As I’ve noted, Rosenstein’s remarks draw on a long tradition. U.S. intelligence agencies, together with the DOJ and the FBI, have reflexively resorted to characterizing their opponents in the encryption debate as fundamentally mercenary (if they’re companies) or fundamentally unrealistic (if they’re privacy advocates). Steven Levy’s 2001 book Crypto, which documented the encryption policy debates of the 1980s and 1990s, details how the FBI framed the question for the Clinton administration:

“What if your child is kidnapped and the evidence necessary to find and rescue your child is unrecoverable because of ‘warrant-proof’ encryption?”

The Clinton administration’s answer—deriving directly from George H.W. Bush-era intelligence initiatives—was to try to create a government standard built around a special combination of encryption hardware and software, labeled “the Clipper Chip” in policy shorthand. If the U.S. government endorsed a high-quality digital-security technology that also was guaranteed not to be “warrant-proof”—that allowed special access to government agents with a warrant—the administration asserted this would provide the appropriate “balance” between privacy guarantees and the rule of law.

But, as Levy documents, the government’s approach in the 1990s raised just as many questions then as Rosenstein’s speech raises now. Levy writes:

“If a crypto solution was not global, it would be useless. If buyers abroad did not trust U.S. products with the [Clipper Chip] scheme, they would eschew those products and buy instead from manufacturers in Switzerland, Germany, or even Russia.”

The United States’ commitment to rule of law also raised questions about how much our legal system should commit itself to enabling foreign governments to demand access to private communications and other data. As Levy asked at the time:

“Should the United States allow access to stored keys to free-speech-challenged nations like Singapore, or China? And would France, Egypt, Japan, and other countries be happy to let their citizens use products that allowed spooks in the United States to decipher conversations but not their own law enforcement and intelligence agencies?”

Rosenstein attempts to paint over this problem by pointing out that U.S.-based technology companies have cooperated in some respects with other countries’ government demands—typically over issues like copyright infringement or child pornography rather than digital-security technologies like encryption. “Surely those same companies and their engineers could help American law enforcement officers enforce court orders issued by American judges, pursuant to American rule of law principles,” he says.

Sure, American companies, like companies everywhere, have complied as required with government demands designed to block content deemed illegal in the countries where they operate. But demanding these companies meet content restrictions (which itself at times raises international rule-of-law questions) is a wholly separate matter from requiring companies to enable law enforcement everywhere to obtain whatever information they want about whatever you do on your phone or on the internet. This is particularly concerning when it comes to foreign governments’ demands for private content and personal information, which might include providing private information about dissidents in unfree or “partly free” countries whose citizens must grapple with oppressive regimes.

Technology companies aren’t just concerned about money. If they were, they would simply leave digital security measures out, since excluding them is cheaper than inventing and installing new ones (such as Apple’s 3D face-recognition technology, set to be deployed in its new iPhone X). Companies invest in security not just to achieve a better bottom line but also to earn the trust of citizens. That’s why Apple resists pressure, both from foreign governments and from the U.S. government, to develop tools that governments (and criminals) could use to turn my iPhone against me. This matters even more in 2017 and beyond, because no matter how narrowly a warrant or wiretap order is written, access to my phone and other digital devices is access to more or less everything in my life. The same is true for most other Americans these days.

Rosenstein is certainly correct to have said “there is no constitutional right to sell warrant-proof encryption”—but there absolutely is a constitutional right to write computer software that encrypts my private information so strongly that government can’t decrypt it easily. (Or at all.) Writing software is generally understood to be presumptively protected expression under the First Amendment. And, of course, one needn’t sell it—many developers of encryption tools have given them away for free.

What’s more, our government’s prerogative to seek information pursuant to a court-issued order or warrant has never been understood to amount to a “constitutional right that every court order or search warrant be successful.” It’s common in our law-enforcement culture—of which Rosenstein is unquestionably a part and partisan—to invert the meaning of the Constitution’s limits on what our government can do, so that law-enforcement procedures under the Fourth and Fifth Amendments are interpreted as a right to investigatory success.

We’ve known this aspect of the encryption debate for a long time, and you don’t have to be a technologist to understand the principle involved. Levy quotes Jerry Berman, then of the Electronic Frontier Foundation and later the founder of the Center for Democracy and Technology, on the issue: “The idea that government holds the keys to all our locks, even before anyone has been accused of committing a crime, doesn’t parse with the public.”

As Berman bluntly sums it up, “It’s not America.”

Mike Godwin (@sfmnemonic) is a distinguished senior fellow at the R Street Institute.

Filed Under: crypto wars, doj, encryption, going dark, privacy, rod rosenstein, rule of law

The Cyber World Is Falling Apart And The DOJ Is Calling For Weakened Encryption

from the better-for-cops,-worse-for-everyone-else dept

It seemed like the (mostly) one-man War on Encryption had reached a ceasefire agreement when “Going Dark” theorist James Comey was unceremoniously ejected from office for failing to pledge allegiance to the new king president. But it had barely had time to be relegated to the “Tired” heap before Deputy Attorney General Rod Rosenstein resurrected it.

Rosenstein has been going from cybersecurity conference to cybersecurity conference raising arguments for encryption before dismissing them entirely. His remarks have opened with the generally awful state of cybersecurity at both the public and private levels. He says encryption is important, especially when there are so many active security threats. Then he undermines his own arguments by calling for “responsible encryption” — a euphemism for weakened encryption that provides law enforcement access to locked devices and communications on secured platforms.

Considering recent events, this isn’t the direction the DOJ should be pushing. Russian hackers used a popular antivirus software to liberate NSA exploits from a contractor’s computer. Equifax exposed the data of millions of US citizens who never asked to be tracked by the service in the first place. Yahoo just admitted everyone who ever signed up for its email service was affected by a years-old security breach. Ransomware based on NSA malware wreaked havoc all over the world. These are all issues Rosenstein has touched on during his remarks. But they’re swiftly forgotten by the Deputy Attorney General when his focus shifts to what he personally — representing US law enforcement — can’t access because of encryption.

DAG Rosenstein needs to pay more attention to the first half of his anti-encryption stump speeches, as Matthew Green points out at Slate:

[A]ny technology that allows U.S. agencies to lawfully access data will present an irresistible target for hackers and foreign intelligence services. The idea that such data will remain safe is laughable in a world where foreign intelligence services have openly leveraged cyberweapons against corporate and political targets. In his speech, Rosenstein claims that the “master keys” needed to enable his proposal can be kept safe, but his arguments are contradicted by recent history. For example, in 2011 hackers managed to steal the master keys for RSA’s SecurID authentication product—and then used those keys to break into a slew of defense contractors. If we can’t secure the keys that protect top-secret documents, it’s hard to believe we’ll do better for your text messages.

Rosenstein is steering everyone towards his new term, “responsible encryption,” but there’s nothing responsible about creating a set of encryption keys for lawful access. It may not technically be a backdoor (a term Rosenstein is trying hard to distance himself from), but it is a hole that wouldn’t otherwise exist. And if keys are created and stored by manufacturers and platform providers, the chance that malicious hackers can find them will always remain above 0%.
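Green’s point about stolen master keys can be made concrete with a small sketch. In an escrowed design, every per-message session key gets wrapped not only for the recipient but also for the escrow authority, so the escrow key becomes a single secret whose compromise unlocks all past traffic. The sketch below is purely illustrative (a toy XOR stands in for a real cipher, and every name in it is hypothetical), but the structural problem it shows is the real one:

```python
# Illustrative sketch only: toy XOR "encryption" stands in for a real
# cipher to show the structural problem with key escrow. Do not use
# anything like this for actual security.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Toy cipher: XOR each byte of data with a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def send_message(plaintext: bytes, recipient_key: bytes, escrow_key: bytes):
    """Encrypt with a fresh session key, then wrap that session key
    twice: once for the recipient, once for the escrow authority."""
    session_key = secrets.token_bytes(32)
    ciphertext = xor(plaintext, session_key)
    wrapped_for_recipient = xor(session_key, recipient_key)
    wrapped_for_escrow = xor(session_key, escrow_key)  # the extra hole
    return ciphertext, wrapped_for_recipient, wrapped_for_escrow

recipient_key = secrets.token_bytes(32)
escrow_key = secrets.token_bytes(32)
ct, w_r, w_e = send_message(b"meet at noon", recipient_key, escrow_key)

# End-to-end path: only the recipient's key opens the message.
assert xor(ct, xor(w_r, recipient_key)) == b"meet at noon"
# Escrow path: whoever holds the escrow key (agency, insider, or thief)
# gets identical access, without the recipient's key or consent.
assert xor(ct, xor(w_e, escrow_key)) == b"meet at noon"
```

Strip the escrow line and stealing any one wrapped key exposes one message; keep it, and stealing the one escrow key exposes every message ever sent through the system, which is exactly what happened with RSA’s SecurID seeds.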

Filed Under: cybersecurity, encryption, going dark, hacks, matthew green, rod rosenstein, security

White House Cyber Security Boss Also Wants Encryption Backdoors He Refuses To Call Backdoors

from the torturing-words dept

Deputy Attorney General Rod Rosenstein recently pitched a new form of backdoor for encryption: “responsible encryption.” The DAG said encryption was very, very important to the security of the nation and its citizens, but not so important it should ever prevent warrants from being executed.

According to Rosenstein, this is the first time in American history law enforcement officers haven’t been able to collect all the evidence they seek with warrants. And that’s all the fault of tech companies and their perverse interest in profits. Rosenstein thinks the smart people building flying cars or whatever should be able to make secure backdoors, but even if they can’t, maybe they could just leave the encryption off their end of the end-to-end so cops can have a look-see.

This is the furtherance of former FBI director James Comey’s “going dark” dogma. It’s being practiced by more government agencies than just the DOJ. Calls for backdoors echo across Europe, with every government official making them claiming they’re not talking about backdoors. These officials all want the same thing: a hole in encryption. All that’s really happening is the development of new euphemisms.

Rob Joyce, the White House cybersecurity coordinator, is the latest to suggest the creation of encryption backdoors — and the latest to claim the backdoor he describes is not a backdoor. During a Q&A at Cyber Summit 2017, Joyce said this:

[Encryption is] “definitely good for America, it’s good for business, it’s good for individuals,” Joyce said. “So it’s really important that we have strong encryption and that’s available.”

Every pitch against secure encryption begins exactly like this: a government official professing their undying appreciation for security. And like every other pitch, the undying appreciation is swiftly smothered by follow-up statements specifying which kinds of security they like.

“The other side of that is there are some evil people in this world, and the rule of law needs to proceed, and so what we’re asking for is for companies to consider how they can support legal needs for information. Things that come from a judicial order, how can they be responsive to that, and if companies consider from the outset of building a platform or building a capability how they’re going to respond to those inevitable asks from a judge’s order, we’ll be in a better place.”

In other words, Joyce loves the security encrypted devices provide. But he’d love them more if they weren’t quite so encrypted. Perhaps if the manufacturers held the keys… The same goes for encrypted communications. Wonderful stuff. Unless the government has a warrant. Then it should be allowed to use its golden key or backdoor or whatever to gain access.

Once again, a government official asks for a built-in backdoor, but doesn’t have the intellectual honesty to describe it as such, nor the integrity to take ownership of the collateral damage. Neither the White House nor Congress seem interested in encryption bans or mandated backdoors. The officials talking about the “going dark” problem keep hinting tech companies should just weaken security for the greater good — with the “greater good” apparently benefiting only government agencies.

This way, when everything goes to hell, officials can wash their hands of the collateral blood because there’s no mandate or legislation tech companies can point to as demanding they acquiesce to the government’s desires. Officials like Joyce and Rosenstein want all of the access, but none of the responsibility. And every single person offering these arguments thinks the smart guys should do all the work and carry 100% of the culpability. Beyond being stupid, these arguments are disingenuous and dangerous. And no one making them seems to show the slightest bit of self-awareness.

Filed Under: cybersecurity, doj, responsible encryption, rob joyce, rod rosenstein

DOJ Continues Its Push For Encryption Backdoors With Even Worse Arguments

from the let-us-save-you-from-your-security dept

Early last week, the Deputy Attorney General (Rod Rosenstein) picked up the recently-departed James Comey’s Torch of Encroaching Darkness +1 and gave one of the worst speeches against encryption ever delivered outside of the UK.

Rosenstein apparently has decided UK government officials shouldn’t have a monopoly on horrendous anti-encryption arguments. Saddling up his one-trick pony, the DAG dumped out a whole lot of nonsensical words in front of a slightly more receptive audience. Speaking at the Global Cyber Security Summit in London, Rosenstein continued his crusade against encryption using counterintuitive arguments.

After name-dropping his newly-minted term — responsible encryption™ — Rosenstein stepped back to assess the overall cybersecurity situation. In short, it is awful. Worse, perhaps, than Rosenstein’s own arguments. Between the inadvertently NSA-backed WannaCry ransomware, the Kelihos botnet, dozens of ill-mannered state actors, and everything else happening seemingly all at once, the world’s computer users could obviously use all the security they can get.

Encryption is key to security. Rosenstein agrees… up to a point. He wants better security for everyone, unless those everyones are targeted by search warrants. Then they have too much encryption.

Encryption is essential. It is a foundational element of data security and authentication. It is central to the growth and flourishing of the digital economy. We in law enforcement have no desire to undermine encryption.

But “warrant-proof” encryption poses a serious problem.

Well, you can’t really have both secure encryption and law enforcement-friendly encryption. Rosenstein knows this just as surely as Comey knew it. That didn’t stop Comey from pretending it was all about tech company recalcitrance. The same goes for Rosenstein who, early on in his speech, plays a shitty version of Sympathy for the Tech Devil by using the phrase “competitive forces” as a stand-in for “profit seeking” when speaking about the uptick in default encryption.

The underlying message of his last speech was that American tech companies should spurn profits for helping out the government by unwrapping one end of end-to-end encryption. The same pitch is made here, softened slightly in the lede thanks to the presence of UK tech companies in the audience. The language may be less divisive, but the arguments are no less stupid this time around.

In the United States, when crime is afoot, impartial judges are responsible for balancing a citizen’s reasonable expectation of privacy against the interests of law enforcement. The law recognizes that legitimate law enforcement needs can outweigh personal privacy concerns. That is how we obtain search warrants for homes and court orders to require witnesses to testify.

Warrant-proof encryption overrides our ability to balance privacy and security. Our society has never had a system where evidence of criminal wrongdoing was impervious to detection by officers acting with a court-authorized warrant. But that is the world that technology companies are creating.

I’m not sure what “system” Rosenstein is speaking about, but there has always been evidence that eluded the grasp of law enforcement. Prior to common telephone use, people still communicated criminal plans, but no one insisted citizens hold every conversation within earshot of law enforcement. Even in a digital world, evidence production isn’t guaranteed, even when encryption isn’t a factor.

Going on from there, the rest of the speech is pretty much identical to his earlier one. In other words: really, really bad and really, really wrong.

Rosenstein believes the government should be able to place its finger on the privacy/security scale without being questioned or stymied by lowly citizens or private companies. Even if he’s right about that (he isn’t), he’s wrong about the balance. This isn’t privacy vs. security. This is security vs. insecurity. For a speech so front-loaded with tales of security breaches and malicious hacking, the back end is nothing more than bad arguments for weakened encryption — something the government may benefit from, but that will do nothing to protect people from malicious hackers or malicious governments.

All the complaints about a skewed balance are being presented by an entity that’s hardly a victim. Electronic devices — particularly cellphones — generate an enormous amount of data that’s not locked behind encryption. The government can — without a warrant — track your movements, either after the fact or, with some creative paperwork, in real time. Tons of other “smart” devices are generating a wealth of records only a third party and a subpoena away. And that’s just the things citizens own. This says nothing about the wealth of surveillance options already deployed by the government and those waiting in the wings for the next sell-off of civil liberties.

It also should be noted Rosenstein is trying to make “responsible encryption” a thing. He obviously wants the word “backdoor” erased from the debate. While it’s tempting to sympathize with Rosenstein’s desire to take a loaded word out of the encryption debate lexicon, the one he’s replacing it with is worse. As Rob Graham at Errata Security points out, the new term is loaded language itself, especially when attached to Rosenstein’s bullshit metric: “measuring success in prevented crimes and saved lives.”

I feel for Rosenstein, because the term “backdoor” does have a pejorative connotation, which can be considered unfair. But that’s like saying the word “murder” is a pejorative term for killing people, or “torture” is a pejorative term for torture. The bad connotation exists because we don’t like government surveillance. I mean, honestly calling this feature “government surveillance feature” is likewise pejorative, and likewise exactly what it is that we are talking about.

Then there’s the problem with Rosenstein deploying rhetorical dodges in his discussions about encryption, which presumably include a number of government officials. Alex Gaynor, who worked for the United States Digital Service and participated in the Obama Administration’s discussion of potential encryption backdoors, points out Rosenstein’s abuse of his position.

Mr. Rosenstein plainly wants to reopen the “going dark” debate that began under the previous administration, spearheaded by FBI Director Jim Comey. While I disagree vehemently with him, it’s a valid policy position – and I have every reason to believe him that there are investigations in which encryption does hamper the Justice Department and FBI’s ability to investigate. However, he is not entitled to mislead the public in order to make that point. And make no mistake. Attempting to use the spectre of familiar computer security challenges in order to make the argument that his policy is necessary, even though his policy has nothing to do with these challenges, is the height of intellectual dishonesty.

There’s an endgame to Rosenstein’s dishonest rhetoric. And it won’t be tech companies being guilted into participating in his “responsible encryption” charade. It will be backdoors. And they will be legislated.

The Deputy Attorney General says that he is interested in “frank discussion”. However, his actual remarks demonstrate he is interested in anything but — his goal is to secure legislation akin to CALEA for your cellphone, and he doesn’t care who he has to mislead to accomplish this. Mr. Deputy Attorney General, I expect better.

This is what the DOJ wants. But Rosenstein is too weak-willed to say it out loud. So he spouts this contradictory, misleading, wholly asinine garbage to whatever audience will have him. Rosenstein is obtuse enough to be dangerous. Fortunately, most legislators (so far) seem unwilling to sacrifice the security of citizens on the altar of lawful access.

Filed Under: backdoors, doj, going dark, james comey, nerd harder, responsible encryption, rod rosenstein