Court Tosses Arkansas Age Verification Law For Violating The 1st Amendment

from the two-down dept

Just after a judge granted an injunction against Texas’ adult content age verification law on 1st Amendment grounds, a judge in Arkansas did the same to that state’s social media age verification law. Trade organization NetChoice had challenged the law, and the court basically gave them a complete and total victory.

Just like the Texas ruling, the opinion here is a good read. As in Texas, Arkansas relied on Tony Allen, who represents the age verification providers, to claim that the technology works great and the laws are fine. And, as in Texas, the court here is not convinced.

Also, as in Texas, the state challenged the standing of the organization bringing the suit, and the court rejects that challenge. We’ll skip over the details because they’re just not that interesting. The important stuff is the 1st Amendment analysis.

First, the court looks to see if the law should be rejected on 1st Amendment grounds for being too vague (the Texas court talked about the vagueness issues, but didn’t rule on that point, only using the vague language to emphasize how the law was not narrowly tailored). Here, the court explains in detail how Arkansas’ law is way too vague:

Here, Act 689 is unconstitutionally vague because it fails to adequately define which entities are subject to its requirements. A “social media company” is defined as “an online forum that a company makes available for an account holder” to “[c]reate a public profile, establish an account, or register as a user for the primary purpose of interacting socially with other profiles and accounts,” “[u]pload or create posts or content,” “[v]iew posts or content of other account holders,” and “[i]nteract with other account holders or users, including without limitation establishing mutual connections through request and acceptance.” Act 689 at § 1101(7)(A) (emphasis added). But the statute neither defines “primary purpose”—a term critical to determining which entities fall within Act 689’s scope—nor provides any guidelines about how to determine a forum’s “primary purpose,” leaving companies to choose between risking unpredictable and arbitrary enforcement (backed by civil penalties, attorneys’ fees, and potential criminal sanctions) and trying to implement the Act’s costly age-verification requirements. Such ambiguity renders a law unconstitutional.

Amusingly, while Arkansas tried to claim it was obvious who was covered, NetChoice got a Snapchat exec to admit that he thought the company was not covered by the law until he heard one of the law’s co-sponsors say otherwise:

The State argues that Act 689’s definitions are clear and that “any person of ordinary intelligence can tell that [Act 689] regulates Meta, Twitter[,] and TikTok.” (Doc. 34, p. 20). But what about other platforms, like Snapchat? David Boyle, Snapchat’s Senior Director of Products, stated in his Declaration that he was not sure whether his company would be regulated by Act 689. He initially suspected that Snapchat would be exempt until he read a news report quoting one of Act 689’s co-sponsors who claimed Snapchat was specifically targeted for regulation.

For what it’s worth, the law is so vague that when we wrote about it after it was signed, we noted it could actually be read to say that TikTok, Snapchat, and YouTube were all excluded. So, for the state to claim it’s obvious who’s covered is laughable.

Apparently, even Arkansas’ lawyers and its own expert witness couldn’t agree on what the law covered (oops!):

During the evidentiary hearing, the Court asked the State’s expert, Mr. Allen, whether he believed Snapchat met Act 689’s definition of a regulated “social media company.” He responded in the affirmative, explaining that Snapchat’s “primary purpose” matched Act 689’s definition of a “social media company” (provided it was true that Snapchat also met the Act’s profitability requirements). When the Court asked the same question to the State’s attorney later on in the hearing, he gave a contrary answer—which illustrates the ambiguous nature of key terms in Act 689. The State’s attorney disagreed with Mr. Allen—his own witness—and said the State’s official position was that Snapchat was not subject to regulation because of its “primary purpose.”

Yeah, I’m gonna say your law is pretty damn vague when that happens. The court then details how other parts of the law are equally vague, including the lack of definitions for “predominant function” and “substantial function,” which are both important in determining who the law applies to. Also this:

Act 689 also fails to define what type of proof will be sufficient to demonstrate that a platform has obtained the “express consent of a parent or legal guardian.” Id. at § 1102(a). If a parent wants to give her child permission to create an account, but the parent and the child have different last names, it is not clear what, if anything, the social media company or third-party servicer must do to prove a parental relationship exists. And if a child is the product of divorced parents who disagree about parental permission, proof of express consent will be that much trickier to establish—especially without guidance from the State.

And the judge notes that the state’s own expert, Mr. Allen, more or less admitted there was no clear way to determine who was a legal guardian for a child under the law.

Then we get to the 1st Amendment specifics. Here the court explores the same topic that a judge in California is currently considering with regard to California’s Age Appropriate Design Code. Namely: is an age verification mandate content-based or content-neutral?

As in California, Arkansas couldn’t resist giving up the ballgame by effectively admitting that the goal of age verification is to suppress certain kinds of speech.

On the other hand, the State points to certain speech-related content on social media that it maintains is harmful for children to view. Some of this content is not constitutionally protected speech, while other content, though potentially damaging or distressing, especially to younger minors, is likely protected nonetheless. Examples of this type of speech include depictions and discussions of violence or self-harming, information about dieting, so-called “bullying” speech, or speech targeting a speaker’s physical appearance, race or ethnicity, sexual orientation, or gender. If the State’s purpose is to restrict access to constitutionally protected speech based on the State’s belief that such speech is harmful to minors, then arguably Act 689 would be subject to strict scrutiny

Thankfully, the judge shot down Arkansas’ attempt to say that this is no different than restricting kids’ access to a bar or a casino. As we’ve pointed out over and over again, there’s a big difference with social media, where you’re dealing with speech. That’s not the case with a casino or a bar. The judge agrees, calling that argument “weak.”

The State’s briefing analogized Act 689 to a restriction on minors entering a bar or a casino. But this analogy is weak. After all, minors have no constitutional right to consume alcohol, and the primary purpose of a bar is to serve alcohol. By contrast, the primary purpose of a social media platform is to engage in speech, and the State stipulated that social media platforms contain vast amounts of constitutionally protected speech for both adults and minors. Furthermore, Act 689 imposes much broader “location restrictions” than a bar does.

Somewhat hilariously, the judge cites an exchange he had with Arkansas’ lawyers before saying “clearly, the State’s analogy is not persuasive.”

THE COURT: Well, to pick up on Mr. Allen’s analogy of the mall, I haven’t been to the Northwest Arkansas mall in a while, but it used to be that there was a restaurant inside the mall that had a bar. And so certainly minors could not go sit at the bar and order up a drink, but they could go to the Barnes & Noble bookstore or the clothing store or the athletic store. Again, borrowing Mr. Allen’s analogy, the gatekeeping that Act 689 imposes is at the front door of the mall, not the bar inside the mall; yes?

THE STATE: The state’s position is that the whole mall is a bar, if you want to continue to use the analogy.

THE COURT: The whole mall is a bar?

THE STATE: Correct.

Your speech suppression law might just be unconstitutional when you’re admitting to a judge that the equivalent would be banning kids from an entire mall because one of its restaurants has a bar inside.

Even though the court says that strict scrutiny almost certainly applies, it tests the law under intermediate scrutiny (which is what the state wanted) and finds that it still fails to pass 1st Amendment muster. Under strict scrutiny, the state has to show a compelling government interest, that the law is narrowly tailored to limit only the speech in question, and that no less restrictive alternatives exist. Under intermediate scrutiny, the law need only serve a significant government interest (rather than a compelling one) and be “substantially related” to achieving that objective, rather than narrowly tailored as strict scrutiny requires.

And, still, the Arkansas law fails. First off, the law clearly creates chilling effects:

It is likely that many adults who otherwise would be interested in becoming account holders on regulated social media platforms will be deterred—and their speech chilled—as a result of the age-verification requirements, which, as Mr. Allen testified, will likely require them to upload official government documents and submit to biometric scans.

And, finally, we see a discussion of the impact on kids’ free speech rights (remember, they have those too). So many of these discussions forget that kids have rights as well, but the judge here remembered:

Act 689 bars minors from opening accounts on a variety of social media platforms, despite the fact that those same platforms contain vast quantities of constitutionally protected speech, even as to minors. It follows that Act 689 obviously burdens minors’ First Amendment Rights….

[….]

Neither the State’s experts nor its secondary sources claim that the majority of content available on the social media platforms regulated by Act 689 is damaging, harmful, or obscene as to minors. And even though the State’s goal of internet safety for minors is admirable, “the governmental interest in protecting children does not justify an unnecessarily broad suppression of speech addressed to adults.” Reno, 521 U.S. at 875; see also Brown, 564 U.S. at 804–05 (“Even where the protection of children is the object, the constitutional limits on governmental action apply.”).

What about how narrowly tailored the bill is? The judge is… not impressed, especially since the state defended the law by citing sites that aren’t even subject to it.

To begin with, the connection between these harms and “social media” is ill defined by the data. It bears mentioning that the State’s secondary sources refer to “social media” in a broad sense, though Act 689 regulates only some social media platforms and exempts many others. For example, YouTube is not regulated by Act 689, yet one of the State’s exhibits discussing the dangers minors face on “social media” specifically cites YouTube as being “the most popular online activity among children aged 3–17” and notes that “[a]mong all types of online platforms, YouTube was the most widely used by children . . . .”…

Likewise, another State exhibit published by the FBI noted that “gaming sites or video chat applications that feel familiar and safe [to minors]” are common places where adult predators engage in financial “sextortion” of minors. See State’s Hearing Exhibit 6. However, Act 689 exempts these platforms from compliance. Mr. Allen, the State’s expert, criticized the Act for being “very limited in terms of the numbers of organizations that are likely to be caught by it, possibly to the point where you can count them on your fingers. . . .”

The state tried to justify this by pointing to an NCMEC (National Center for Missing and Exploited Children) article that listed the “most dangerous” sites, but the judge questioned the data:

During the hearing, the Court observed that the data in the NCMEC article lacked context; the article listed raw numbers but did not account for the amount of online traffic and number of users present on each platform. The State’s attorney readily agreed, noting that “Facebook probably has the most people on it, so it’s going to have the most reports.” But he still opined that the NCMEC data was a sound way to target the most dangerous social media platforms, so “the highest volume [of reports] is probably where the law would be concentrated.”

Frankly, if the State claims Act 689’s inclusions and exemptions come from the data in the NCMEC article, it appears the drafters of the Act did not read the article carefully. Act 689 regulates Facebook and Instagram, the platforms with the two highest numbers of reports. But, the Act exempts Google, WhatsApp, Omegle, and Snapchat— the sites with the third-, fourth-, fifth-, and sixth-highest numbers of reports. Nextdoor is at the very bottom of NCMEC’s list, with only one report of suspected child sexual exploitation all year, yet the State’s attorney noted during the hearing that Nextdoor would be subject to regulation under Act 689.

Ouch!

Also this:

None of the experts and sources cited by the State indicate that risks to minors are greater on platforms that generate more than $100 million annually. Instead, the research suggests that it is the amount of time that a minor spends unsupervised online and the content that he or she encounters there that matters. However, Act 689 does not address time spent on social media; it only deals with account creation. In other words, once a minor receives parental consent to have an account, Act 689 has no bearing on how much time the minor spends online. Using the State’s analogy, if a social media platform is like a bar, Act 689 contemplates parents dropping their children off at the bar without ever having to pick them up again. The Act only requires parents to give express permission to create an account on a regulated social media platform once. After that, it does not require parents to utilize content filters or other controls or monitor their children’s online experiences

It kinda sounds like those drafting these laws (1) have no idea what they’re talking about and (2) don’t much care to find out.

The judge is equally unimpressed by the state’s argument that requiring parental permission for account signups will do much good:

The State’s brief argues that “requiring a minor to have parental authorization to make a profile on a social media site . . . . means that many minors will be protected from the well-documented mental health harms present on social media because their parents will have to be involved in their profile creation” and are therefore “more likely to be involved in their minor’s online experience.” (Doc. 34, p. 19). But this is just an assumption on the State’s part, and there is no evidence of record to show that a parent’s involvement in account creation signals an intent to be involved in the child’s online experiences thereafter.

The court even points out that the UK’s (very problematic!) Online Safety Bill seems to be “more consistent” with US Supreme Court precedent than Arkansas’ law is.

Consider the differences between Act 689 and the UK’s Online Safety Bill. Mr. Allen, who worked on the UK legislation, testified that the UK’s main concern was preventing minors from accessing particular content, whereas Arkansas will require age verification at the time of account creation, regardless of the content. It appears the UK’s approach is more consistent with Supreme Court precedent than Arkansas’s approach. In Packingham, the Court observed that it was possible for a state to “enact specific, narrowly tailored laws” targeted to “conduct that often presages a sexual crime, like contacting a minor or using a website to gather information about a minor”; but it would be unconstitutional for a state to unduly burden adult access to social media

End result: Arkansas’ law likely violates the 1st Amendment, both because of its vague language and because its reach extends well beyond the state’s very narrow interests. In the process, it would strip adults of their own 1st Amendment rights through chilling effects, along with kids’ own rights to access information.

In sum, NetChoice is likely to succeed on the merits of the First Amendment claim it raises on behalf of Arkansas users of member platforms. The State’s solution to the very real problems associated with minors’ time spent online and access to harmful content on social media is not narrowly tailored. Act 689 is likely to unduly burden adult and minor access to constitutionally protected speech. If the legislature’s goal in passing Act 689 was to protect minors from materials or interactions that could harm them online, there is no compelling evidence that the Act will be effective in achieving those goals.

Bottom line: the law cannot go into effect.

It seems that Thursday was a great day for the 1st Amendment, and a bad day for state legislatures pushing age verification (and the age verification provider lobbyists).

Filed Under: 1st amendment, access to information, age verification, arkansas, free speech, social media, tony allen
Companies: netchoice