The Plan To Sunset Section 230 Is About A Rogue Congress Taking The Internet Hostage If It Doesn’t Get Its Way
from the the-beatings-will-continue-until-the-internet-improves dept
If Congress doesn’t get Google and Meta to agree to Section 230 reforms, it’s going to destroy the rest of the open internet, while Google and Meta will be just fine. If that sounds stupidly counterproductive, well, welcome to today’s Congress.
As we were just discussing, the House Energy and Commerce Committee is holding a hearing on the possibility of sunsetting Section 230 at the end of next year. This follows an earlier hearing last month where representatives heard testimony about Section 230 so confused that it actively misrepresented reality.
But, based on that one terribly misleading hearing, the top Republican (Cathy McMorris Rodgers) and Democrat (Frank Pallone) on the committee created this bill to sunset the law, along with a nearly facts-free op-ed in the Wall Street Journal making a bunch of blatantly false claims about Section 230. In writing about that bill, I complained that it was ridiculous that neither representative could be bothered to walk down the hall to talk to Senator Wyden, who coauthored Section 230 and could explain to Rodgers and Pallone their many factual errors.
As I said in last week’s Ctrl-Alt-Speech podcast, they were basically holding a gun to the head of the internet and saying that if Google and Facebook didn’t come up with a deal to appease Congress, Congress would shoot the internet dead.
Now, Wyden and his Section 230 co-author, former Rep. Chris Cox, have penned their own WSJ op-ed that basically makes the same point, with the brilliant title: Buy This Legislation or We’ll Kill the Internet. Because that’s exactly what this “sunset” bill is about. It’s demanding that “big tech” (Meta and Google) come up with a plan to appease Congress, or Congress will effectively kill the internet, by making it nearly impossible for smaller sites to exist.
Just one of the many nonsensical aspects of this plan is the assumption by Rodgers and Pallone that Meta’s and Google’s interests are aligned with those of the wider internet, its users, and smaller sites. There are tons of other sites on the internet that would be damaged far more by removing Section 230.
But Cox and Wyden are pretty clear in pointing out just how wrong all this is. They highlight this trope of threatening to kill something if someone doesn’t get their way:
A 1973 National Lampoon cover featured a dog with a gun to its head. The headline: “If You Don’t Buy This Magazine, We’ll Kill This Dog.” The image is reminiscent of how Congress approaches its most serious responsibilities.
This is tragically true. It’s how Congress has handled the debt ceiling for many years now. It’s how Congress has dealt with reform (or, really, lack thereof) of our deeply flawed surveillance system. But, it’s extra ridiculous to have it happen here.
The latest such exercise will be on display at a House hearing on Wednesday, where members of both parties will threaten to repeal the clear-cut legal rules that for decades have governed millions of websites. The dog with a gun to its head is every American who uses the internet.
The law in question is Section 230 of the 1996 Communications Decency Act. The statute provides that the person who creates content online is legally responsible for it and that websites aren’t liable for efforts to moderate their platforms to make them more welcoming, useful or interesting.
Or, as Prof. Eric Goldman (taking inspiration from the opening paragraph of the Wyden/Cox op-ed) put it in meme form:
As Cox and Wyden make clear, the framework of Section 230 is entirely sensible, but only if you actually bother to read it and understand it:
When we introduced this legislation in 1995, when both of us served in the House, two things convinced our colleagues to endorse it almost unanimously.
The first was that the internet was different from traditional publishing. The equation had been flipped. We weren’t dealing with millions of people watching a television network’s production, or subscribers reading a newspaper. Publishing and broadcasting tools were suddenly free or nearly so, offering a microphone to millions of Americans who wouldn’t have the power, clout or fame to be featured on NBC’s “Meet the Press” or in Time magazine.
The second was that without new legislation, the law perversely penalized content moderation. Under the old rules of publisher liability, only an “anything goes” approach would protect a website from legal responsibility for user-created content. Prohibiting bullying, swearing, harassment, and threats of violence could be legally disastrous for any site. It was clear, then as now, that if the law were to encourage such a hands-off approach, the internet would turn into a cesspool.
It’s important to remember this history when evaluating the merits of sunsetting Section 230, as the House proposal intends. According to the bill’s text, if Congress can’t agree on a successor to Section 230 by Dec. 31, 2025, websites from Yahoo and Etsy to the local restaurant hosting customer reviews will become liable for every syllable posted on the site by a user or troll. A single post can generate claims that run into the millions of dollars.
I might challenge the wording in that last paragraph a little bit (though I understand why it was written that way within the confines of a short op-ed). Without Section 230, sites don’t automatically become fully liable for content posted by users (some people incorrectly assume they do). Rather, their liability becomes an open question, to be settled through litigation that is extremely costly whether or not a court ultimately determines that the underlying post could reasonably generate a claim.
This is the part that often gets lost in this discussion. Removing Section 230 flings open the courthouse doors to all sorts of vexatious litigation that is extraordinarily costly just to determine whether a site is liable in the first place. And when that happens, there is tremendous pressure on websites to do a few things. The first is to simply remove any content that is at risk of a lawsuit (or that is the subject of a legal threat) just to avoid the costly fight that might ensue. So removing 230 creates a kind of litigator’s veto: threaten a lawsuit and there’s a good chance the content gets removed.
The other thing is that, if a site does get sued, the cost of defending the lawsuit becomes so high that many companies (along with their law firms and insurers) will push to just settle. Settling for a nuisance fee will often be significantly cheaper than fighting the full litigation, even if the website would have a high likelihood of winning in the end.
The problem with removing Section 230 is not just the fear of ultimate liability. Much of it is the cost of proving you shouldn’t be liable, which is orders of magnitude higher without Section 230. But this is also a big part of what critics of Section 230 do not understand (or, if they’re plaintiffs’ lawyers, it’s exactly the lever they want to use against websites).
As Wyden and Cox make clear:
Reverting to this pre-Section 230 status quo would dramatically alter, and imperil, the online world. Most platforms don’t charge users for access to their sites. In the brave new world of unlimited liability, will a website decide that carrying user-created content free of charge isn’t worth the risk? If so, the era of consumer freedom to both publish and view web content will come to a screeching halt.
It is very much a question of “if you don’t alter 230 in a way Congress likes, Congress will shoot the internet.”
It’s ridiculous that we’ve gotten to this point, and that the support for this destruction is effectively bipartisan. Equally ridiculous is the framing underlying this effort: the false belief that the biggest of the big tech companies, Google and Meta, are the only real stakeholders here.
As I’ve said over and over again, that’s not the case. Both of those companies have buildings full of lawyers. They, above anyone else, can shoulder the costs of these lawsuits. It’s all the other sites that cannot and will not.
At a time when it’s clear that Google and Meta are effectively fine with putting the open web into managed decline while building up their own walled gardens, removing Section 230 will accelerate that process. It will give the biggest internet companies that much more power while harming everyone else, and the harm will be felt most keenly by the end users who rely on sites and services beyond Google and Meta.
So the metaphor here really seems to be Congress pointing a gun at the open internet and threatening to shoot it if Google and Meta (which do not represent the open internet) don’t dance to Congress’ tune. The whole situation is truly messed up.
Filed Under: cathy mcmorris rodgers, chris cox, frank pallone, ron wyden, section 230, section 230 repeal
Bipartisan Bill To Repeal Section 230 Defended In Facts-Optional Op-Ed
from the it-can-always-get-dumber dept
Section 230, the legal backbone of the internet, is under attack again. This time, it is from a bipartisan pair of legislators who seem to fundamentally misunderstand how the law works and what the consequences of repealing it would be.
We’ve talked about plenty of attempts to reform Section 230, and why all of them would be problematic. But now we have a bipartisan attempt to repeal it outright. Cathy McMorris Rodgers and Frank Pallone, the top Republican and Democrat, respectively, on the House Energy & Commerce Committee, have announced a draft bill to “sunset” Section 230.
It’s perhaps no surprise that Rodgers and Pallone don’t understand Section 230. The one hearing they held on the matter was packed only with witnesses who hate Section 230, and included multiple witnesses who flat out had no clue what they were talking about, to an embarrassing degree.
When you are only informed by ignorance, the policy proposals you come out with are going to be ignorant.
The bill does nothing more than say that Section 230 no longer exists after December 31, 2025.
Section 230 of the Communications Act of 1934 (47 U.S.C. 230) is amended by adding at the end the following:
‘‘(g) SUNSET.—This section shall have no force or effect after December 31, 2025.’’.
Along with the draft of the bill, the pair published a facts-optional op-ed in the Wall Street Journal in support of it. But, if you don’t have a way around the paywall, they also published the full piece on the House Energy & Commerce Committee website as a “press release” (who knew the WSJ was willing to just rerun press releases?)
Let’s break down everything it gets wrong. Buckle up, it may take some time.
The internet’s original promise was to help people and businesses connect, innovate and share information. Congress passed the Communications Decency Act in 1996 to realize those goals. It was an overwhelming success. Section 230 of the act helped shepherd the internet from the “you’ve got mail” era into today’s global nexus of communication and commerce.
First off, no, this is wrong. The “Communications Decency Act” was passed in 1996 in the midst of a ridiculous moral panic about “the children online,” in response to incendiary media reports that turned out to have been mostly fabricated. The whole point of the Communications Decency Act was not to “help people and businesses connect, innovate and share information.” It was to allow gullible politicians to grandstand about how they were “protecting the children online.”
Of course, the whole thing turned out to be nonsense, and nearly all of the Communications Decency Act was tossed out as unconstitutional.
What is now known as Section 230, written by then-Reps. Chris Cox and Ron Wyden, was designed to be an alternative, better approach in response to the nonsense moral panic, and to overturn the dangerous decision in Stratton Oakmont v. Prodigy, which held that Prodigy’s attempt to moderate its forums to make them family friendly meant it could be held liable for anything it didn’t take down. It only got tacked on to the larger CDA later.
Cox and Wyden, unlike Rodgers and Pallone, were smart enough to recognize that if you wanted to keep any parts of the internet “family friendly,” you had to make sure that sites would not be liable. Otherwise, why would companies risk hosting any content at all?
Further, unlike Rodgers and Pallone, Cox and Wyden understood the First Amendment well enough to know that distributor liability comes only with actual knowledge. Thus, if any site did decide it was worth the hassle to host third-party content, it would almost certainly do so with explicit plans never to look at that content and never to moderate, because the resulting lack of knowledge would protect it from liability.
The end result would be a disaster: fewer sites willing to host content at all, and those that do unwilling to do any moderation for fear of liability. Certainly nothing family friendly, as that would risk absolutely ruinous litigation.
This wasn’t meant to just be there for the beginning of the internet. It was meant for the long haul. I realize it’s on the other side of the Capitol, but Rodgers and Pallone could make the trek over to Wyden’s office to ask him (those nice underground trains for members in the Capitol make the trip speedy).
Or just read what Cox and Wyden filed a few years ago to debunk some of the myths about Section 230, like the idea that it was only meant for the early years:
Section 230, originally named the Internet Freedom and Family Empowerment Act, H.R. 1978, was designed to address the obviously growing problem of individual web portals being overwhelmed with user-created content. This is not a problem the internet will ever grow out of; as internet usage and content creation continue to grow, the problem grows ever bigger. Far from wishing to offer protection to an infant industry, our legislative aim was to recognize the sheer implausibility of requiring each website to monitor all of the user-created content that crossed its portal each day.
Critics of Section 230 point out the significant differences between the internet of 1996 and today. Those differences, however, are not unanticipated. When we wrote the law, we believed the internet of the future was going to be a very vibrant and extraordinary opportunity for people to become educated about innumerable subjects, from health care to technological innovation to their own fields of employment. So we began with these two propositions: let’s make sure that every internet user has the opportunity to exercise their First Amendment rights; and let’s deal with the slime and horrible material on the internet by giving both websites and their users the tools and the legal protection necessary to take it down.
There’s more, but it really seems like Rodgers and Pallone maybe should have understood that before insisting the intent was the opposite.
Anyway, back to this nonsense “press release”/WSJ op-ed:
Unfortunately, Section 230 is now poisoning the healthy online ecosystem it once fostered. Big Tech companies are exploiting the law to shield them from any responsibility or accountability as their platforms inflict immense harm on Americans, especially children. Congress’s failure to revisit this law is irresponsible and untenable. That is why we’re taking bipartisan action.
This is the same kind of nonsense that went into the original Rimm Report that inspired Senator James Exon to push the unconstitutional CDA in the first place.
Anyone who believes that “big tech” is “exploiting” Section 230 “to shield them from any responsibility or accountability as their platforms inflict immense harm on Americans, especially children” is either not paying attention or is lying. I’m not sure which is worse.
Every “big tech” platform has a large team of people who work on trust & safety, trying to keep platforms safe for everyone. Most have efforts dedicated specifically to protecting children online.
I know that there are media reports claiming that the tech companies don’t care, but that’s silly. If you talk to the people at these companies who work in these roles, and look at all of the tools and features they roll out, you’ll see that it’s not true.
And Section 230 doesn’t “shield” them from any kind of responsibility or accountability. Indeed, any time anything bad happens to a child that is even loosely connected to their online presence, there are tons of media stories criticizing the companies, which can lead to diminished usage and advertisers bailing on the platforms. On top of that, the app stores enforce far more stringent safety measures than the law requires.
These companies know full well that harming children is simply bad for business. It’s ridiculous for sitting members of Congress to pretend otherwise.
We must act because Big Tech is profiting from children, developing algorithms that push harmful content on to our kids’ feeds and refusing to strengthen their platforms’ protections against predators, drug dealers, sex traffickers, extortioners and cyberbullies. Children are paying the price, developing addictive and dangerous habits, often at the expense of their mental health. Big Tech has failed to uphold American democratic values and be fair stewards of the speech they host.
This is utter garbage devoid of any connection to reality. The companies are constantly working on ways to keep harmful content out of kids’ feeds for all the reasons discussed above. It’s bad for everyone. It’s bad for the kids. It’s bad for the families. It’s bad for business.
And you know what allows them to make sure those algorithms try to keep harmful content out of the feeds? Section 230! Because, again, without it, companies wouldn’t offer algorithms at all: doing so would be moderating, which would create the potential for knowledge and, with it, ruinous liability.
The claims about social media harming the mental health of kids have been debunked over and over and over and over and over again. It’s just sad that sitting members of Congress are still repeating them.
Furthermore, it’s not clear that either Rodgers or Pallone has ever read Section 230. They talk about drug dealers, sex traffickers, and extortionists, leaving out that Section 230 doesn’t apply to federal criminal law at all, and has a separate (more recent) carve-out for sex trafficking.
It seems pretty damn rich to say that Section 230 protects websites from liability for sex trafficking when the law has already been explicitly changed to say that it doesn’t. Are Rodgers and Pallone just not aware of what the law says?
And, I mean, before we rush into repealing Section 230 outright, maybe we should look at that amendment, which was put in place with lots of moralizing about “protecting sex trafficking victims,” but which has been shown to have actually put women at tremendous risk.
It’s almost as if politicians can do real harm when they grandstand without understanding the underlying law.
Over the years lawmakers have tried to no avail to address these concerns, thanks in part to Big Tech’s refusal to engage in a meaningful way. Congress has made good-faith efforts to find a solution that preserves Big Tech’s ability to innovate and ensures safety and accountability for past and future harm. It’s time to make that a reality, which is why we are unveiling today bipartisan draft legislation to sunset Section 230.
Again, this is not true. They did pass FOSTA, and as most of the tech industry warned, it has resulted in a ton of harm and basically no benefit.
And, note the underlying premise: that Section 230 only protects “Big Tech” and that it’s “Big Tech” that is trying to stop any changes. That’s also a lie. FOSTA passed because Facebook embraced it.
“Big Tech” is absolutely willing to compromise on Section 230, because they know any such deal plays into their hands. It’s all the other sites that get screwed by the resulting litigation and liability. Meta and Google and the other big tech companies have buildings full of lawyers. Removing Section 230 may harm them at the margins, but they’ll make up for it by having all the smaller competition wiped out.
Our measure aims to restore the internet’s intended purpose—to be a force for free expression, prosperity and innovation. It would require Big Tech and others to work with Congress over 18 months to evaluate and enact a new legal framework that will allow for free speech and innovation while also encouraging these companies to be good stewards of their platforms. Our bill gives Big Tech a choice: Work with Congress to ensure the internet is a safe, healthy place for good, or lose Section 230 protections entirely.
This is beyond delusional. This is Rodgers and Pallone putting a gun to the head of the open internet and telling big tech “let’s make a deal, or we kill the open internet.”
But, again, notice the only stakeholder they think matters here: “big tech.” The rest of us get fucked. Users. Small sites. All of us who rely on Section 230 to have spaces to talk get cut out of the deal, as Facebook and Google will absolutely cook up something that protects their interests and screws the rest of us.
But, even more importantly, note what Rodgers and Pallone are explicitly admitting in the paragraph above: they know that Section 230 enables “free speech and innovation.” Yet they’re willing to kill it to spite a few companies they don’t like. What sort of legislator does that?
The proposal would likewise ensure that social-media companies are held accountable for failing to protect our children. Tech companies currently enjoy immunity under Section 230 that applies to virtually no other industry. That makes no sense and must end. Traditional media outlets have long informed and entertained us without the same expansive legal shield. We need a solution that ensures accountability, keeps children safe, and levels the playing field so Big Tech is treated like other industries.
And… we’re back to just not understanding how any of this works. If you hold companies liable for third-party speech, you run into all of the problems described above: fewer companies willing to host any speech, and those that do either moderating so heavily as to make their services useless or looking the other way entirely to avoid the requisite knowledge.
It appears that Rodgers and Pallone don’t understand the first thing about how any of this works. Remove 230 and add in liability for anything bad that happens to kids, and you have no more open internet. What companies are going to host any content knowing the consequences if they miss something or — much more likely — they simply get blamed for something bad happening to kids that was only tangentially related to the site?
We have already seen tech companies sued for things almost entirely unrelated to the content on their sites, simply because someone had an account there. Right now, litigation is happening all over the place in which people are trying to hold social media liable for basically every harm you can think of.
What kind of person would look at those vexatious lawsuits and say “we need more of that!”?
Reforming Section 230 won’t “break the internet” or hurt free speech, as Big Tech warns. The First Amendment—not Section 230—is the basis for our free-speech protections in the U.S., and it won’t cease providing them even if Section 230 immunity no longer exists.
Again, this just screams their own ignorance. Yes, the First Amendment can protect against some of the more egregious cases, but mounting a First Amendment defense takes way longer and is way, way, way more expensive. And that means it becomes far more lucrative to file vexatious cases, because companies will choose to pay nuisance fees just to make the cases go away.
Section 230 helps protect against some of that by making it easier to kick such cases out of court earlier, rather than after many more years and much greater expense.
Rodgers and Pallone would likely respond that they don’t care how much it costs or how much time it takes because “Big Tech can afford it.” But again, that ignores that the biggest beneficiaries of Section 230 are not big tech, but everyone who runs an interactive computer service that hosts third-party content. And yes, that includes people who forward emails or retweet.
The ability to file SLAPP suits against people for retweets and forwarded emails would be a huge fucking mess.
Meaningful changes will ensure these companies are no longer able to hide behind a broadly interpreted law while they manipulate and profit from Americans’ free-speech protections. Updating Section 230 will empower parents, children and others who have been exploited by criminals, drug dealers and others on social-media platforms.
They don’t “hide behind the law.” They get mistargeted, vexatious lawsuits thrown out early.
Updating Section 230 won’t do shit in any of the examples described. It will empower a bunch of ambulance-chasing lawyers to shake down tons of people and companies with threats of costly lawsuits. As for the “criminals” and “drug dealers,” what they do is already against the law and already falls outside 230’s protections.
Sunsetting Section 230 will require Congress and stakeholders to create a solution that ensures accountability, protects innovation and free speech, and reflects the realities of the digital age. The internet can thrive while still being fair to companies and safe for all.
None of this is correct. All of it is inaccurate. Notice how many outright errors there were in this “op-ed”:
- Section 230 was not created to help just the early internet, but was seen as a permanent fix to an issue that the drafters knew would get worse over time.
- None of the hyperbolic claims about the “harms” are supported by the research and data.
- They don’t seem to realize that Section 230 already exempts criminal activity and has an explicit exemption for sex trafficking.
- They don’t seem to realize that there are other accountability measures beyond “the law,” including users and business partners fleeing.
- They don’t seem to understand how Section 230 works or how it intersects with the First Amendment.
- They don’t understand how vexatious litigants abuse the legal system and how this would help them.
- They don’t understand the nature of the First Amendment and the requirement for knowledge for there to be liability.
- They think the only stakeholders are big tech companies, most of whom have already conceded that they don’t need Section 230, knowing that the real harm will go to smaller competitors.
And on, and on, and on.
We see lots of frustratingly stupid takes on internet policy. But this has to be one of the absolute worst, stupidest examples of ignorant policymaking by the damn fools that we keep electing to Congress.
It’s truly disheartening to see such a fundamental misunderstanding of the issues at hand from those in positions of power. The consequences of their proposed actions would be devastating, not just for the tech industry, but for free speech and innovation as a whole.
Filed Under: 1st amendment, cathy mcmorris rodgers, fosta, frank pallone, free speech, moral panic, protect the children, section 230, section 230 repeal