kids code – Techdirt

I Explained To A Court How California’s ‘Kid’s Code’ Is Both Impossible To Comply With & An Attack On Our Expression

from the the-wrong-approach dept

Last year, Techdirt was one of the very few sites where you could find information about California’s AB 2273, officially the “California Age Appropriate Design Code” or “Kid’s Code.” As with so many bills that talk about “protecting the children,” everyone we talked to said they were afraid to speak up, worried that they’d be branded as being against child safety. Indeed, I even had some people within larger tech companies reach out to me suggesting it was dangerous to speak out against the bill.

But the law is ridiculous. Last August, I explained how it was literally impossible to comply with the bill, questioned why California lawmakers were willing to pass a law written by a British Baroness (who is also a Hollywood filmmaker) with little to no understanding of how any of this actually works, and highlighted how the age verification requirements would be a privacy nightmare putting more kids at risk, rather than protecting them. Eric Goldman also pointed out the dark irony that, while the Kid’s Code claims it was put in place to prevent internet companies from conducting radical experiments on children, the bill itself is an incredibly radical experiment in trying to reshape the internet. Of course, the bill was signed into law last fall.

In December, NetChoice, which brought the challenges to Texas and Florida’s bad internet laws, sued to block this law as well. Last week, it filed for a preliminary injunction to stop the law from going into effect. Even though the law doesn’t officially take effect until the summer of 2024, any covered website would need to start a ton of work now to get ready. Alongside the filing were a series of declarations from various website owners highlighting the many, many problems this law will create for sites (especially smaller ones). Among those declarations was the one I filed, explaining how this law is impossible to comply with, would invade the privacy of the Techdirt community, and would act as an unconstitutional restriction on speech. But we’ll get to that.

First up, the motion for the injunction. It’s worth reading the whole thing, as it details the myriad ways in which this law is unconstitutional. It violates the 1st Amendment by creating prior restraint in multiple ways. The law is both extremely vague and overly broad. It regulates speech based on its content (again violating the 1st Amendment). It also violates the Commerce Clause, as a California law that would impact those well outside the state. Finally, existing federal law, both COPPA and Section 230, preempts it. I won’t go through it all, but each of these arguments is clearly laid out in the motion.

But what I appreciate most is that it opens up with a hypothetical that should illustrate just how obviously unconstitutional the law is:

Imagine a law that required bookstores, before offering books and services to the public, to assess whether those books and services could “potentially harm” their youngest patrons; develop plans to “mitigate or eliminate” any such risks; and provide those assessments to the state on demand. Under this law, bookstores could only carry books the state deemed “appropriate” for young children unless they verified the age of each patron at the door. Absent such age verification, employees could not ask customers about the types of books they preferred or whether they had enjoyed specific titles—let alone recommend a book based on customers’ expressed interests—without a “compelling” reason that doing so was in the “best interests” of children. And the law would require bookstores to enforce their store rules and content standards to the state’s satisfaction, eliminating the bookstores’ discretion as to how those rules should be applied. Penalties for violations could easily bankrupt even large bookstores. Such a scheme would plainly violate fundamental constitutional protections.

California has enacted just such a measure: The California Age Appropriate Design Code Act (AB 2273). Although billed as a “data protection” regulation to protect minors, AB 2273 is the most extensive attempt by any state to censor speech since the birth of the internet. It does this even though the State has conceded that an open, vibrant internet is indispensable to American life. AB 2273 enacts a system of prior restraint over protected speech using undefined, vague terms, and creates a regime of proxy censorship, forcing online services to restrict speech in ways the State could never do directly. The law violates the First Amendment and the Commerce Clause, and is preempted by the Children’s Online Privacy Protection Act (COPPA), 15 U.S.C. §§ 6501 et seq., and Section 230 of the Communications Decency Act, 47 U.S.C. § 230. Because AB 2273 forces online providers to act now to redesign services, irrespective of its formal effective date, it will cause imminent irreparable harm. The Court should enjoin the statute.

As for my own filing, it was important for me to make clear that a law like AB 2273 is a direct attack on Techdirt and its users’ expression.

Techdirt understands that AB 2273 will require covered businesses to evaluate and mitigate the risk that “potentially harmful content” will reach children, with children defined to equally cover every age from 0 to 18 despite the substantial differences in developmental readiness and ability to engage in the world around them throughout that nearly two-decade age range. This entire endeavor results in the State directly interfering with my company’s and my expressive rights by limiting to whom and how we can communicate to others. I publish Techdirt with the deliberate intention to share my views (and those of other authors) with the public. This law will inhibit my ability to do so in concrete and measurable ways.

In addition to its overreaching impact, the law’s prohibitions also create chilling ambiguity, such as in its use of the word “harm.” In the context of the issues that Techdirt covers on a daily basis, there is no feasible way that Techdirt can determine whether any number of its articles could, in one way or another, expose a child to “potentially harmful” content, however the State defines that phrase according to the political climate of the moment. For example, Techdirt covers a broad array of hot-button topics, including reporting on combating police brutality (sometimes with accompanying images and videos), online child sexual abuse, bullying, digital sexual harassment, and law enforcement interrogations of minors—all of which could theoretically be deemed by the State to be “potentially harmful” to children. Moreover, Techdirt’s articles are known for their irreverent and snarky tone, and frequently use curse words in their content and taglines. It would be impossible to know whether this choice of language constitutes “potentially harmful content” given the absence of any clear definition of the term in AB 2273. Screening Techdirt’s forum for “potentially harmful” content—and requiring Techdirt to self-report the ways its content and operations could hypothetically “harm” children—will thus cause Techdirt to avoid publishing or hosting content that could even remotely invite controversy, undermining Techdirt’s ability to foster lively and uninhibited debate on a wide range of topics of its choosing. Moreover, not only would Techdirt’s prospective expression be chilled, but the retroactive application of AB 2273 would result in Techdirt needing to censor its previous expression, and to an enormous degree. The sheer number of posts and comments published on Techdirt makes the self-assessment needed to comply with the law’s ill-defined rules functionally impossible, requiring an enormous allocation of resources that Techdirt is unable to dedicate.

Also, the age verification requirements would fundamentally put the privacy of all of our readers at risk by forcing us to collect data about our users that we do not want, and that we’ve gone to great lengths to make sure is not collected.

Redesigning our publication to verify the ages of our readers would also compromise our deliberate practice to minimize how much data we collect and retain about our readers to both limit our obligations that would arise from the handling of such data as well as preserve trust with our readers and undermine our relationship with our readers of any age, including teenagers, by subjecting them to technologies that are at best, unreliable, and at worst, highly privacy-intrusive (such as facial recognition). Moreover, because a sizeable portion of Techdirt’s readership consists of casual readers who access the site for information and news, any requirement that forces users to submit extensive personal information simply to access Techdirt’s content risks driving away these readers and shrinking Techdirt’s audience.

I have no idea how the courts are going to treat this law. Again, it does feel like many in the industry have decided to embrace and support this kind of regulation. I’ve heard from too many people inside the industry who have said not to speak up about it. But it’s such a fundamentally dangerous bill, with an approach that we’re starting to see show up in other states, that it was too important not to speak up.

Filed Under: 1st amendment, aadc, ab 2273, age appropriate design code, age verification, facial scanning, free expression, kids code, privacy
Companies: netchoice

Kids Use Discord Chat To Track Predator Teacher’s Actions; Under California’s Kids Code, They’d Be Blocked

from the be-careful-how-you-"protect"-those-children dept

It’s often kind of amazing how much moral panics by adults treat kids as if they’re completely stupid and unable to do anything for themselves. It’s a common theme: adults insist that because some bad things could happen, they must be prevented entirely, without ever considering that maybe a large percentage of kids are capable enough to deal with the risks and dangers themselves.

The Boston Globe recently had an interesting article about how a group of middle school boys were able to use Discord to successfully track the creepy, disgusting, and inappropriate shit one of their teachers/coaches did towards their female classmates, and how that data is now being used in an investigation of the teacher, who has been put on leave.

In an exclusive interview with The Boston Globe, one of the boys described how, in January 2021, he and his friends decided to start their “Pedo Database” to track the teacher’s words and actions.

There’s even a (redacted) screenshot of the start of the channel.

The kids self-organized and used Discord as a useful tool for tracking the problematic interactions.

During COVID, as they attended class online, they’d open the Discord channel on a split-screen and document the teacher’s comments in real time:

“You all love me so choose love.”

“You gotta stand up and dance now.”

Everyone “in bathing suits tomorrow.”

Once they were back in class in person, the boys jotted down notes to add to the channel later: Flirting with one girl. Teasing another. Calling the girls “sweetheart” and “sunshine.” Asking one girl to take off her shoes and try wiggling her toes without moving her pinkies.

“I felt bad for [the girls] because sometimes it just seems like it was a humiliating thing,” the boy told the Globe. “He’d play a song and he’d make one of them get up and dance.”

When the school year ended, the boys told incoming students about the Discord channel and encouraged them to keep tabs on the teacher. All in all, eight boys were involved, he said.

Eventually, the teacher was removed from the school and put on leave, after the administration began an investigation following claims that “the teacher had stalked a pre-teen girl at the middle school while he was her coach, and had been inappropriate with other girls.”

The article notes that there had been multiple claims against the teacher in the past, but that other teachers and administrators had long protected him. Indeed, the teacher apparently bragged about how he’d survived such complaints for decades. That’s when the kids stepped up and realized they needed to start doing something themselves.

“I don’t think there was a single adult who would ever — like their parents, my mom, like anybody in the school — who had ever really taken the whole thing seriously before,” he added.

The boy’s mother contacted Conlon, and now the “Pedo Database” is in the hands of the US attorney’s Office, the state Department of Children, Youth, and Families, the state Department of Education, and with lawyer Matthew Oliverio, who is conducting the school’s internal investigation.

“I did not ever think this would actually be used as evidence, but we always had it as if it was,” said the boy, who is now 15 and a student at North Kingstown High School. “So I’m glad that we did, even though it might have seemed like slightly stupid at times.”

So, here we have kids who used the internet to keep track of a teacher accused of preying on children. Seems like a good example of helping to protect children.

Yet, it seems worth noting that under various “protect the children” laws, this kind of activity would likely be blocked. Already, under COPPA, it’s questionable whether the kids should even be allowed on Discord. Discord, like many websites, limits usage in its terms of service to those 13 or older, likely in an attempt to comply with COPPA. But the article notes that the kids started keeping this database as 6th graders, when they were likely 11 years old.

Also, under California’s AB 2273, Discord would likely have been far more aggressive in banning them, as it would have had to employ much more stringent age verification tools that probably would have barred them from the service entirely. And given the other requirements of the “Age Appropriate Design Code,” it seems likely that Discord would have shut down a chat channel described as a “pedo database” in the first place. A bunch of kids discussing possible pedophilia? Clearly that should be blocked as potentially harmful.

So, once again, the law, rather than protecting kids, might actually have put them more at risk, and done more to protect the adults who were putting kids’ safety at risk.

Filed Under: ab 2273, age appropriate design code, kids, kids code, teachers
Companies: discord