Barr's Motives, Encryption and Protecting Children; DOJ 230 Workshop Review, Part III
from the don't-break-the-internet dept
In Part I of this series on the Department of Justice’s February 19 workshop, “Section 230 — Nurturing Innovation or Fostering Unaccountability?” (archived video and agenda), we covered why Section 230 is important, how it works, and how panelists proposed to amend it. Part II explored Section 230’s intersection with criminal law.
Here, we ask what DOJ’s real objective with this workshop was. The answer to us seems clear: use Section 230 as a backdoor for banning encryption — a “backdoor to a backdoor” — in the name of stamping out child sexual abuse material (CSAM) while, conveniently, distracting attention from DOJ’s appalling failures to enforce existing laws against CSAM. We conclude by explaining how to get tough on CSAM to protect kids without amending Section 230 or banning encryption.
Banning Encryption
In a blistering speech, Trump’s embattled Attorney General, Bill Barr, blamed the 1996 law for a host of ills, especially the spread of child sexual abuse material (CSAM). But he began the speech as follows:
[Our] interest in Section 230 arose in the course of our broader review of market-leading online platforms, which we announced last summer. While our efforts to ensure competitive markets through antitrust enforcement and policy are critical, we recognize that not all the concerns raised about online platforms squarely fall within antitrust. Because the concerns raised about online platforms are often complex and multi-dimensional, we are taking a holistic approach in considering how the department should act in protecting our citizens and society in this sphere.
In other words, the DOJ is under intense political pressure to “do something” about “Big Tech” — most of all from Republicans, who have increasingly fixated on the idea that “Big Tech” is the new “Liberal Media” out to get them. They’ve proposed a flurry of bills to amend Section 230 — either to roll back its protections or to hold companies hostage, forcing them to do things that really have nothing to do with Section 230, like be “politically neutral” (the Hawley bill) or ban encryption (the Graham-Blumenthal bill), because websites and Internet services simply can’t operate without Section 230’s protections.
Multiple news reports have confirmed our hypothesis going into the workshop: that its purpose was to tie Section 230 to encryption. Even more importantly, the closed-door roundtable after the workshop (to which we were, not surprisingly, not invited) reportedly concluded with a heated discussion of encryption, after the DOJ showed participants draft amendments making Section 230 immunity contingent on compromising encryption by offering a backdoor to the U.S. government. Barr’s speech said essentially what we predicted he would say right before the workshop:
Technology has changed in ways that no one, including the drafters of Section 230, could have imagined. These changes have been accompanied by an expansive interpretation of Section 230 by the courts, seemingly stretching beyond the statute’s text and original purpose. For example, defamation is Section 230’s paradigmatic application, but Section 230 immunity has been extended to a host of additional conduct — from selling illegal or faulty products to connecting terrorists to facilitating child exploitation. Online services also have invoked immunity even where they solicited or encouraged unlawful conduct, shared in illegal proceeds, or helped perpetrators hide from law enforcement. …
Finally, and importantly, Section 230 immunity is relevant to our efforts to combat lawless spaces online. We are concerned that internet services, under the guise of Section 230, can not only block access to law enforcement — even when officials have secured a court-authorized warrant — but also prevent victims from civil recovery. This would leave victims of child exploitation, terrorism, human trafficking, and other predatory conduct without any legal recourse. Giving broad immunity to platforms that purposefully blind themselves — and law enforcers — to illegal conduct on their services does not create incentives to make the online world safer for children. In fact, it may do just the opposite.
Barr clearly wants to stop online services from “going dark” through Section 230 — even though Section 230 has little (if any) direct connection to encryption. His argument was clear: Section 230 protections shouldn’t apply to services that use strong encryption. That’s precisely what the Graham-Blumenthal EARN IT Act would do: greatly lower the bar for enforcing existing federal criminal law on CSAM, allow state prosecutions and civil lawsuits (under a lower burden of proof), but then let Internet services “earn” back their Section 230 protection against this increased liability by doing whatever a commission convened and controlled by the Attorney General tells them to do.
Those two Senators are expected to formally introduce their bill in the coming weeks. Undoubtedly, they’ll refer back to Barr’s speech, claiming that law enforcement needs their bill passed ASAP to “protect the children.”
Barr’s speech on encryption last July didn’t mention Section 230, but it went much further in condemning strong encryption. If you read it carefully, you can see where Graham and Blumenthal got their idea of lowering the standard of existing federal law on CSAM from “actual knowledge” to “recklessness,” which would allow the DOJ to sue websites that offer stronger encryption than the DOJ thinks is really necessary. Specifically, Barr said:
The Department has made clear what we are seeking. We believe that when technology providers deploy encryption in their products, services, and platforms they need to maintain an appropriate mechanism for lawful access. This means a way for government entities, when they have appropriate legal authority, to access data securely, promptly, and in an intelligible format, whether it is stored on a device or in transmission. We do not seek to prescribe any particular solution. …
We are confident that there are technical solutions that will allow lawful access to encrypted data and communications by law enforcement without materially weakening the security provided by encryption. Such encryption regimes already exist. For example, providers design their products to allow access for software updates using centrally managed security keys. We know of no instance where encryption has been defeated by compromise of those provider-maintained keys. Providers have been able to protect them. …
Some object that requiring providers to design their products to allow for lawful access is incompatible with some companies’ “business models.” But what is the business objective of the company? Is it “A” — to sell encryption that provides the best protection against unauthorized intrusion by bad actors? Or is it “B” — to sell encryption that assures that law enforcement will not be able to gain lawful access? I hope we can all agree that if the aim is explicitly “B” — that is, if the purpose is to block lawful access by law enforcement, whether or not this is necessary to achieve the best protection against bad actors — then such a business model, from society’s standpoint, is illegitimate, and so is any demand for that product. The product jeopardizes the public’s safety, with no countervailing utility. …
The real question is whether the residual risk of vulnerability resulting from incorporating a lawful access mechanism is materially greater than those already in the unmodified product. The Department does not believe this can be demonstrated.
In other words, companies choosing to offer encryption should have to justify their decision to do so, given the risks created by denying law enforcement access to user communications. That’s pretty close to a “recklessness” standard.
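To make the technical stakes concrete, here’s a minimal toy sketch (in Python, using the third-party cryptography package; the scenario and variable names are ours, purely for illustration, and do not represent any company’s actual design). In an end-to-end encrypted service the provider never holds the decryption key, so a warrant served on the provider yields only ciphertext; a “lawful access mechanism” of the kind Barr describes requires the provider to escrow a copy of that key, and that escrowed copy is exactly the new attack surface security researchers warn about:

```python
# Toy illustration (not a real protocol): why "lawful access" requires the
# provider to hold a key it otherwise wouldn't have.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

# --- End-to-end model: only the users' devices ever hold the key. ---
user_key = Fernet.generate_key()              # generated on the users' devices
ciphertext = Fernet(user_key).encrypt(b"meet at 6")

provider_copy_of_key = None                   # provider stores only ciphertext
# A warrant served on the provider yields ciphertext it cannot decrypt.

# --- "Lawful access" model: provider escrows a copy of the key. ---
escrowed_key = user_key                       # retained server-side for access requests
plaintext_for_warrant = Fernet(escrowed_key).decrypt(ciphertext)
print(plaintext_for_warrant)                  # b'meet at 6'

# The escrow store can decrypt every user's messages, so anyone who steals
# or abuses it (an insider, a hacker, a foreign government) gets the same
# access a warrant would. That is the "residual risk" Barr says cannot be
# demonstrated; cryptographers say it is the whole problem.
```

The point of the sketch is simply that “lawful access” and “the provider cannot read your messages” are mutually exclusive properties: whichever key store satisfies the warrant also has to be defended against everyone else.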
Again, for more on this, read Berin’s previous Techdirt piece. According to the most recently leaked version of the Graham-Blumenthal bill, the Attorney General would no longer be able to rewrite the “best practices” recommended by the commission. But he would gain greater ability to steer the commission by vetoing its recommendations, again and again, until it does what he wants. If the commission never makes a recommendation, the safe harbor for complying with the “best practices” simply never takes effect — but the rest of the law still would. Specifically, website and Internet service operators would still face vague new criminal and civil liability for “reckless” product design. The commission and its recommendations are a red herring; the truly coercive parts of the bill take effect regardless of what the commission does. If the DOJ signals that failing to offer a backdoor (or to retain user data) will lead to legal liability, companies will comply — even absent any formalized “best practices.”
The Real Scandal: DOJ’s Inattention to Child Sexual Abuse
As if trying to compromise the security of all Internet services and the privacy of all users weren’t bad enough, we suspect Barr had an even more devious motive: covering his own ass, politically.
Blaming tech companies generally and encryption in particular for the continued spread of CSAM kills two birds with one stone. Not only does it give the DOJ a new way to ban encryption, it also deflects attention from the real scandal that should appall us all: the collective failure of Congress, the Trump Administration, and the Department of Justice to prioritize the fight against the sexual exploitation of children.
The Daily, The New York Times podcast, ran part one of a two-part series on this topic on Wednesday. Reporters Michael Keller and Gabriel Dance summarized a lengthy investigative report they published back in September, one that hasn’t received the attention it deserves. Here’s the key part:
The law Congress passed in 2008 foresaw many of today’s problems, but The Times found that the federal government had not fulfilled major aspects of the legislation.
The Justice Department has produced just two of six required reports that are meant to compile data about internet crimes against children and set goals to eliminate them, and there has been a constant churn of short-term appointees leading the department’s efforts. The first person to hold the position, Francey Hakes, said it was clear from the outset that no one “felt like the position was as important as it was written by Congress to be.”
The federal government has also not lived up to the law’s funding goals, severely crippling efforts to stamp out the activity.
Congress has regularly allocated about half of the 60millioninyearlyfundingforstateandlocallawenforcementefforts.Separately,theDepartmentofHomelandSecuritythisyeardivertednearly60 million in yearly funding for state and local law enforcement efforts. Separately, the Department of Homeland Security this year diverted nearly 60millioninyearlyfundingforstateandlocallawenforcementefforts.Separately,theDepartmentofHomelandSecuritythisyeardivertednearly6 million from its cybercrimes units to immigration enforcement — depleting 40 percent of the units’ discretionary budget until the final month of the fiscal year.
So, to summarize:
- Congress has spent only half as much as it promised to;
- DOJ hasn’t bothered issuing reports required by law — the best way to get lawmakers to cough up promised funding; and
- The Trump Administration has chosen to spend money on the political theatre of immigration enforcement rather than stopping CSAM trafficking.
Let that sink in. In a better, saner world, Congress would be holding hearings to demand explanations from Barr. But it hasn’t, and the workshop will allow Barr to claim he’s getting tough on CSAM without actually doing anything about it — while also laying the groundwork for legislation that would essentially allow him to ban encryption.
Even for Bill Barr, that’s pretty low.
Filed Under: cda 230, congress, csam, doj, encryption, funding, section 230, william barr