Texas Court Gets It Right: Dumps Texas's Social Media Moderation Law As Clearly Unconstitutional

from the nicely-done dept

Back in June we reported on how Florida’s social media moderation law was tossed out as unconstitutional by a federal court in Florida. The ruling itself was a little bit weird, but an easy call on 1st Amendment grounds. It was perhaps not surprising, but still stupid, to see Texas immediately step up and pass its own version of such a bill, which was signed into law in September. We again predicted that a court would quickly toss it out as unconstitutional.

And that’s exactly what has happened.

There was some whispering and concern that Texas’ law was craftier than the Florida law, and that parts of it might survive, but, nope. And this ruling is actually more thorough, and clearer, than the slightly jumbled Florida ruling. It’s chock full of good quotes. The only thing that sucks about this ruling, honestly, is that Texas is definitely going to appeal it to the 5th Circuit Court of Appeals, and the 5th Circuit is the craziest of circuits, seeming by far the most likely to ignore basic 1st Amendment concepts in favor of some weird Trumpist political grandstanding.

However, for this brief shining moment, let’s celebrate a good, clean ruling that vindicates all the points many of us have been making about just how batshit crazy the Texas law was, and how it was so blatantly an infringement on the 1st Amendment rights of websites. There are a bunch of pages wasted on proving that the trade groups who brought the lawsuit have standing, which aren’t worth rehashing here beyond saying that, yes, trade groups for internet companies have the standing to challenge this law.

From there, the ruling gets down to the heart of the matter, and it’s pretty straightforward: content moderation is the same thing as editorial discretion, and that’s clearly protected by the 1st Amendment.

Social Media Platforms Exercise Editorial Discretion Protected by the First Amendment

Judge Robert Pitman cites all the key cases here — Reno v. ACLU (which tossed out all of the CDA — minus Section 230 — as unconstitutional, but also clearly established that the 1st Amendment applies to the internet), Sorrell v. IMS Health (establishing that the dissemination of information is speech) and, perhaps most importantly, Manhattan Cmty. Access v. Halleck, the Justice Brett Kavanaugh-authored ruling we’ve highlighted many times for making it quite clear that private internet companies are free to moderate however they see fit. It also cites the key case that was instrumental to the ruling in Florida: Miami Herald v. Tornillo, which made clear the 1st Amendment protections for editorial discretion:

Social media platforms have a First Amendment right to moderate content disseminated on their platforms. See Manhattan Cmty. Access Corp. v. Halleck, 139 S. Ct. 1921, 1932 (2019) (recognizing that “certain private entities[] have rights to exercise editorial control over speech and speakers on their properties or platforms”). Three Supreme Court cases provide guidance. First, in Tornillo, the Court struck down a Florida statute that required newspapers to print a candidate’s reply if a newspaper assailed her character or official record, a “right of reply” statute. 418 U.S. at 243. In 1974, when the opinion was released, the Court noted there had been a “communications revolution” including that “[n]ewspapers have become big business . . . [with] [c]hains of newspapers, national newspapers, national wire and news services, and one-newspaper towns [being] the dominant features of a press that has become noncompetitive and enormously powerful and influential in its capacity to manipulate popular opinion and change the course of events.” Id. at 248–49. Those concerns echo today with social media platforms and “Big Tech” all the while newspapers are further consolidating and, often, dying out. Back to 1974, when newspapers were viewed with monopolistic suspicion, the Supreme Court concluded that newspapers exercised “editorial control and judgment” by selecting the “material to go into a newspaper,” deciding the “limitations on the size and content of the paper,” and deciding how to treat “public issues and public officials—whether fair or unfair.” Id. at 258. “It has yet to be demonstrated how governmental regulation of this crucial process can be exercised consistent with First Amendment guarantees of a free press as they have evolved to this time.”

There’s also a fun bit for all the very silly people who keep insisting that social media websites are “common carriers” which could subject them to certain restrictions. The court says “nope,” highlights how very different they are from common carriers, and moves on.

This Court starts from the premise that social media platforms are not common carriers. “Equal access obligations . . . have long been imposed on telephone companies, railroads, and postal services, without raising any First Amendment issue.” United States Telecom Ass’n v. Fed. Commc’ns Comm’n, 825 F.3d 674, 740 (D.C. Cir. 2016). Little First Amendment concern exists because common carriers “merely facilitate the transmission of speech of others.” Id. at 741. In United States Telecom, the Court added broadband providers to its list of common carriers. Id. Unlike broadband providers and telephone companies, social media platforms “are not engaged in indiscriminate, neutral transmission of any and all users’ speech.” Id. at 742. User-generated content on social media platforms is screened and sometimes moderated or curated. The State balks that the screening is done by an algorithm, not a person, but whatever the method, social media platforms are not mere conduits. According to the State, our inquiry could end here, with Plaintiffs not needing to prove more to show they engage in protected editorial discretion. During the hearing, the Court asked the State, “[T]o what extent does a finding that these entities are common carriers, to what extent is that important from your perspective in the bill’s ability to survive a First Amendment challenge?” (See Minute Entry, Dkt. 47). Counsel for the State responded, “[T]he common carriage doctrine is essential to the First Amendment challenge. It’s why it’s the threshold issue that we’ve briefed . . . . It dictates the rest of this suit in terms of the First Amendment inquiry.” (Id.). As appealing as the State’s invitation is to stop the analysis here, the Court continues in order to make a determination about whether social media platforms exercise editorial discretion or occupy a purgatory between common carrier and editor.

There’s also a short footnote totally dismissing the fact that the Texas bill, HB20, tries to just outright declare social media sites as common carriers. That’s not how any of this works.

HB 20’s pronouncement that social media platforms are common carriers… does not impact this Court’s legal analysis.

The judge briefly notes that social media is obviously different in many ways from newspapers, and that AI-based moderation is certainly a technological differentiator, but then brings it back around to basic principles: it’s still all editorial discretion.

This Court is convinced that social media platforms, or at least those covered by HB 20, curate both users and content to convey a message about the type of community the platform seeks to foster and, as such, exercise editorial discretion over their platform’s content.

In fact, Texas legislators’ and the governor’s own hubris helped sink this law: both the text of the bill itself and their public statements about it admit that this is all about editorial discretion.

Indeed, the text of HB 20 itself points to social media platforms doing more than transmitting communication. In Section 2, HB 20 recognizes that social media platforms “(1) curate[] and target[] content to users, (2) place[] and promote[] content, services, and products, including its own content, services, and products, (3) moderate[] content, and (4) use[] search, ranking, or other algorithms or procedures that determine results on the platform.” Tex. Bus. & Com. Code § 120.051(a)(1)–(4). Finally, the State’s own basis for enacting HB 20 acknowledges that social media platforms exercise editorial discretion. “[T]here is a dangerous movement by social media companies to silence conservative viewpoints and ideas.” Governor Abbott Signs Law Protecting Texans from Wrongful Social Media Censorship, OFFICE OF THE TEX. GOVERNOR (Sept. 9, 2021), https://gov.texas.gov/news/post/governorabbott-signs-law-protecting-texans-from-wrongful-social-media-censorship. “Texans must be able to speak without being censored by West Coast oligarchs.” Bryan Hughes (@SenBryanHughes), TWITTER (Aug. 9, 2021, 4:34 PM), https://twitter.com/SenBryanHughes/status/1424846466183487492. Just like the Florida law, a “constant theme of [Texas] legislators, as well as the Governor . . . , was that the [platforms’] decisions on what to leave in or take out and how to present the surviving material are ideologically biased and need to be reined in.” NetChoice, 2021 WL 2690876, at *7. Without editorial discretion, social media platforms could not skew their platforms ideologically, as the State accuses them of doing. Taking it all together, case law, HB 20’s text, and the Governor and state legislators’ own statements all acknowledge that social media platforms exercise some form of editorial discretion, whether or not the State agrees with how that discretion is exercised.

And then, once it’s clear that moderation is the same as editorial discretion, it’s easy to see how the bill’s restrictions are a clear 1st Amendment problem. The ruling shows this, first, by highlighting the impossible choices the bill puts in front of social media companies, using the example of content about Nazis.

The State claims that social media platforms could prohibit content categories “such as ‘terrorist speech,’ ‘pornography,’ ‘spam,’ or ‘racism’” to prevent those content categories from flooding their platforms. (Resp. Prelim. Inj. Mot., Dkt. 39, at 21). During the hearing, the State explained that a social media platform “can’t discriminate against users who post Nazi speech . . . and [not] discriminate against users who post speech about the antiwhite or something like that.” (See Minute Entry, Dkt. 47). Plaintiffs point out the fallacy in the State’s assertion with an example: a video of Adolf Hitler making a speech, in one context the viewpoint is promoting Nazism, and a platform should be able to moderate that content, and in another context the viewpoint is pointing out the atrocities of the Holocaust, and a platform should be able to disseminate that content. (See id.). HB 20 seems to place social media platforms in the untenable position of choosing, for example, to promote Nazism against its wishes or ban Nazism as a content category. (Prelim. Inj. Mot., Dkt. 12, at 29). As YouTube put it, “YouTube will face an impossible choice between (1) risking liability by moderating content identified to violate its standards or (2) subjecting YouTube’s community to harm by allowing violative content to remain on the site.”

And thus:

HB 20’s prohibitions on “censorship” and constraints on how social media platforms disseminate content violate the First Amendment.

Why?

HB 20 compels social media platforms to significantly alter and distort their products. Moreover, “the targets of the statutes at issue are the editorial judgments themselves” and the “announced purpose of balancing the discussion—reining in the ideology of the large social-media providers—is precisely the kind of state action held unconstitutional in Tornillo, Hurley, and PG&E.” Id. HB 20 also impermissibly burdens social media platforms’ own speech. Id. at *9 (“[T]he statutes compel the platforms to change their own speech in other respects, including, for example, by dictating how the platforms may arrange speech on their sites.”). For example, if a platform appends its own speech to label a post as misinformation, the platform may be discriminating against that user’s viewpoint by adding its own disclaimer. HB 20 restricts social media platforms’ First Amendment right to engage in expression when they disagree with or object to content.

At this point, the court dismisses, in a footnote, the two cases that very silly people always bring up: Pruneyard and Rumsfeld. Pruneyard is the unique shopping mall case, which has very limited reach, and Rumsfeld is about requiring universities that host employment recruiters to also host military recruiters on campus. Supporters of efforts to force websites to host speech point to both cases as some sort of “proof” that it’s okay to compel speech, but both are very narrowly focused, and anyone relying on either is making a bad faith argument: “well, in these cases you could compel speech, so in this case obviously you can as well.” But the judge isn’t having any of it.

The Court notes that two other Supreme Court cases address this topic, but neither applies here. PruneYard Shopping Center v. Robins is distinguishable from the facts of this case. 447 U.S. 74 (1980). In PruneYard, the Supreme Court upheld a California law that required a shopping mall to host people collecting petition signatures, concluding there was no “intrusion into the function of editors” since the shopping mall’s operation of its business lacked an editorial function. Id. at 88. Critically, the shopping mall did not engage in expression and “the [mall] owner did not even allege that he objected to the content of the [speech]; nor was the access right content based.” PG&E, 475 U.S. at 12. Similarly, Rumsfeld v. Forum for Academic & Institutional Rights, Inc. has no bearing on this Court’s holding because it did not involve government restrictions on editorial functions. 547 U.S. 47 (2006). The challenged law required schools that allowed employment recruiters on campus to also allow military employment recruiters on campus—a restriction on “conduct, not speech.” Id. at 62, 65. As the Supreme Court explained, “accommodating the military’s message does not affect the law schools’ speech, because the schools are not speaking when they host interviews and recruiting receptions.”

Even more importantly, the court rejects the transparency requirements in HB20. Again, this part was one that some people thought might slide through and be left in place. We’ve discussed, multiple times, how transparency on these issues is important, but that mandated transparency actually creates serious problems. The court, thankfully, agrees.

To pass constitutional muster, disclosure requirements like these must require only “factual and noncontroversial information” and cannot be “unjustified or unduly burdensome.” NIFLA, 138 S. Ct. at 2372. Section 2’s disclosure and operational provisions are inordinately burdensome given the unfathomably large numbers of posts on these sites and apps. For example, in three months in 2021, Facebook removed 8.8 million pieces of “bullying and harassment content,” 9.8 million pieces of “organized hate content,” and 25.2 million pieces of “hate speech content.” (CCIA Decl., Dkt. 12-1, at 15). During the last three months of 2020, YouTube removed just over 2 million channels and over 9 million videos because they violated its policies. (Id. at 16). While some of those removals are subject to an existing appeals process, many removals are not. For example, in a three-month period in 2021, YouTube removed 1.16 billion comments. (YouTube Decl., Dkt. 12-3, at 23–24). Those 1.16 billion removals were not appealable, but, under HB 20, they would have to be. (Id.). Over the span of six months in 2018, Facebook, Google, and Twitter took action on over 5 billion accounts or user submissions—including 3 billion cases of spam, 57 million cases of pornography, 17 million cases of content regarding child safety, and 12 million cases of extremism, hate speech, and terrorist speech. (NetChoice Decl., Dkt. 12-2, at 8). During the State’s deposition of Neil Christopher Potts (“Potts”), who is Facebook’s Vice President of Trust and Safety Policy, Potts stated that it would be “impossible” for Facebook “to comply with anything by December 1, [2021]. . . [W]e would not be able to change systems in that nature. . . . I don’t see a way that we would actually be able to go forward with compliance in a meaningful way.” (Potts Depo., Dkt. 39-2, at 2, 46). Plaintiffs also express a concern that revealing “algorithms or procedures that determine results on the platform” may reveal trade secrets or confidential and competitively-sensitive information. (Id. at 34) (quoting Tex. Bus. & Com. Code § 120.051(a)(4)).

The Section 2 requirements burden First Amendment expression by “forc[ing] elements of civil society to speak when they otherwise would have refrained.” Washington Post v. McManus, 944 F.3d 506, 514 (4th Cir. 2019). “It is the presence of compulsion from the state itself that compromises the First Amendment.” Id. at 515. The provisions also impose unduly burdensome disclosure requirements on social media platforms “that will chill their protected speech.” NIFLA, 138 S. Ct. at 2378. The consequences of noncompliance also chill the social media platforms’ speech and application of their content moderation policies and user agreements. Noncompliance can subject social media platforms to serious consequences. The Texas Attorney General may seek injunctive relief and collect attorney’s fees and “reasonable investigative costs” if successful in obtaining injunctive relief. Id. § 120.151.

I’ll just note that we mentioned that very Washington Post v. McManus case earlier this week, in calling out the Washington Post’s hypocrisy in demanding mandatory disclosure rules for internet companies…

And Judge Pitman isn’t done yet with the constitutional problems of HB20.

HB 20 additionally suffers from constitutional defects because it discriminates based on content and speaker. First, HB 20 excludes two types of content from its prohibition on content moderation and permits social media platforms to moderate content: (1) that “is the subject of a referral or request from an organization with the purpose of preventing the sexual exploitation of children and protecting survivors of sexual abuse from ongoing harassment,” and (2) that “directly incites criminal activity or consists of specific threats of violence targeted against a person or group because of their race, color, disability, religion, national origin or ancestry, age, sex, or status as a peace officer or judge.” Tex. Civ. Prac. & Rem. Code § 143A.006(a)(2)–(3). When considering a city ordinance that applied to “‘fighting words’ that . . . provoke violence[] ‘on the basis of race, color, creed, religion[,] or gender,’” the Supreme Court noted that those “who wish to use ‘fighting words’ in connection with other ideas—to express hostility, for example, on the basis of political affiliation, union membership, or []sexuality—are not covered.” R.A.V. v. City of St. Paul, Minn., 505 U.S. 377, 391 (1992). As Plaintiffs argue, the State has “no legitimate reason to allow the platforms to enforce their policies over threats based only on . . . favored criteria but not” other criteria like sexual orientation, military service, or union membership. (Prelim. Inj. Mot., Dkt. 12, at 35–36); see id.

There’s also some good language in here for those who keep insisting that setting (often arbitrary) size thresholds or carveouts in these laws is perfectly fine. Not so, if they lead to a discriminatory impact on venues for speech:

HB 20 applies only to social media platforms of a certain size: platforms with 50 million monthly active users in the United States. Tex. Bus. & Com. Code § 120.002(b). HB 20 excludes social media platforms such as Parler and sports and news websites. (See Prelim. Inj. Mot., Dkt. 12, at 17). During the regular legislative session, a state senator unsuccessfully proposed lowering the threshold to 25 million monthly users in an effort to include sites like “Parler and Gab, which are popular among conservatives.” Shawn Mulcahy, Texas Senate approves bill to stop social media companies from banning Texans for political views, TEX. TRIBUNE (Mar. 30, 2021), https://www.texastribune.org/2021/03/30/texas-social-media-censorship/. “[D]iscrimination between speakers is often a tell for content discrimination.” NetChoice, 2021 WL 2690876, at *10. The discrimination between speakers has special significance in the context of media because “[r]egulations that discriminate among media, or among different speakers within a single medium, often present serious First Amendment concerns.” Turner Broad. Sys., Inc. v. F.C.C., 512 U.S. 622, 659 (1994). The record in this case confirms that the Legislature intended to target large social media platforms perceived as being biased against conservative views and the State’s disagreement with the social media platforms’ editorial discretion over their platforms. The evidence thus suggests that the State discriminated between social media platforms (or speakers) for reasons that do not stand up to scrutiny.

And, of course, everyone’s favorite: HB 20 is unconstitutionally vague.

First, Plaintiffs take issue with HB 20’s definition for “censor”: “block, ban, remove, deplatform, demonetize, de-boost, restrict, deny equal access or visibility to, or otherwise discriminate against expression.” Tex. Civ. Prac. & Rem. Code § 143A.001(1). Plaintiffs argue that requiring social media platforms to provide “equal access or visibility to” content is “hopelessly indeterminate.” (Prelim. Inj. Mot., Dkt. 12, at 37) (quoting id.). The Court agrees. A social media platform is not a static snapshot in time like a hard copy newspaper. It strikes the Court as nearly impossible for a social media platform—that has at least 50 million users—to determine whether any single piece of content has “equal access or visibility” versus another piece of content given the huge numbers of users and content. Moreover, this requirement could “prohibit[] a social media platform from” displaying content “in the proper feeds”

There are some other drafting oddities that the judge calls out, including this one:

HB 20 empowers the Texas Attorney General to seek an injunction not just against violations of the statute but also “potential violations.” Tex. Civ. Prac. & Rem. Code § 143A.008. Unlike other statutes that specify that the potential violation must be imminent, HB 20 includes no such qualification. See, e.g., Tex. Occ. Code § 1101.752(a) (authorizing the attorney general to seek injunctive relief to abate a potential violation “if the commission determines that a person has violated or is about to violate this chapter”). Subjecting social media platforms to suit for potential violations, without a qualification, reaches almost all content moderation decisions platforms might make, further chilling their First Amendment rights.

As in the Florida case, the court here notes that even if there were some reason under which the law should be judged under intermediate, rather than strict, scrutiny, it would still fail.

HB 20 imposes content-based, viewpoint-based, and speaker-based restrictions that trigger strict scrutiny. Strict scrutiny is satisfied only if a state has adopted “‘the least restrictive means of achieving a compelling state interest.’” Americans for Prosperity Found. v. Bonta, 141 S. Ct. 2373, 2383, 210 L. Ed. 2d 716 (2021) (quoting McCullen v. Coakley, 573 U.S. 464, 478 (2014)). Even under the less rigorous intermediate scrutiny, the State must prove that HB 20 is “‘narrowly tailored to serve a significant government interest.’” Packingham v. North Carolina, 137 S. Ct. 1730, 1736 (2017) (quoting McCullen, 573 U.S. at 477). The proclaimed government interests here fall short under both standards.

It’s not even a difficult call. It’s the kind of “duh” explanation that made it easy for us to say upfront that this law was so obviously unconstitutional:

The State’s first interest fails on several accounts. First, social media platforms are privately owned platforms, not public forums. Second, this Court has found that the covered social media platforms are not common carriers. Even if they were, the State provides no convincing support for recognizing a governmental interest in the free and unobstructed use of common carriers’ information conduits. Third, the Supreme Court rejected an identical government interest in Tornillo. In Tornillo, Florida argued that “government has an obligation to ensure that a wide variety of views reach the public.” Tornillo, 418 U.S. at 247–48. After detailing the “problems related to government-enforced access,” the Court held that the state could not commandeer private companies to facilitate that access, even in the name of reducing the “abuses of bias and manipulative reportage [that] are . . . said to be the result of the vast accumulations of unreviewable power in the modern media empires.” Id. at 250, 254. The State’s second interest—preventing “discrimination” by social media platforms—has been rejected by the Supreme Court. Even given a state’s general interest in anti-discrimination laws, “forbidding acts of discrimination” is “a decidedly fatal objective” for the First Amendment’s “free speech commands.”…

And, the court practically laughs out loud at the idea that HB 20 was “narrowly tailored.”

Even if the State’s purported interests were compelling and significant, HB 20 is not narrowly tailored. Sections 2 and 7 contain broad provisions with far-reaching, serious consequences. When reviewing the similar statute passed in Florida, the Northern District of Florida found that that statute was not narrowly tailored “like prior First Amendment restrictions.” NetChoice, 2021 WL 2690876, at *11 (citing Reno, 521 U.S. at 882; Sable Commc’n of Cal., Inc. v. FCC, 492 U.S. 115, 131 (1989)). Rather, the court colorfully described it as “an instance of burning the house to roast a pig.” Id. This Court could not do better in describing HB 20.

End result: injunction granted, the law does not go into effect today as originally planned. Texas will undoubtedly now appeal, and we can only hope the 5th Circuit doesn’t muck things up, as it’s been known to do. Depending on how this plays out, as well as how the 11th Circuit handles the Florida case, it’s possible this could hit the Supreme Court down the road. Hopefully, both the 11th and the 5th actually take heed of Justice Kavanaugh’s words in the Halleck case, and choose to uphold both district court rulings — and we can get past this silly Trump-inspired moral panic attack on the 1st Amendment rights of social media platforms — the very same rights that enable them to create spaces for us to speak and share our own ideas.

Filed Under: 1st amendment, content moderation, florida, hb20, section 230, social media, strict scrutiny, texas
Companies: ccia, netchoice