porn license – Techdirt

Justin Trudeau Rightly Points Out That Internet Porn Licenses Are A Dumb Fucking Idea; But He Still Supports Internet Censorship Bill

from the one-out-of-two-is-still-not-great dept

With the UK moving forward with its plans for an online “porn license” for anyone to view adult content online, and with various state legislatures pushing to effectively require the same in the US, this dumb and counterproductive idea is proving quite popular among the political class.

Thankfully, not in Canada. Canadian Prime Minister Justin Trudeau (who hasn’t been great on internet issues during his tenure) at least recognizes just how stupid an idea it is to require a special license to view adult content online.

Prime Minister Justin Trudeau said Thursday that adults shouldn’t have to share their personal information to access pornography online.

Speaking at a housing announcement in Cape Breton, Trudeau said Conservative Leader Pierre Poilievre’s endorsement of some sort of age-verification system for porn sites is something his party opposes.

“He’s proposing that adults should have to give their ID and personal information to sketchy websites, or create a digital ID for adults to be able to browse the web where they want,” Trudeau said of Poilievre.

“That’s something we stand against.”

I mean, it’s a small thing, but at least we finally find a line that some politicians won’t cross.

That said, even the context here is still problematic. Trudeau is pushing a different terrible internet regulation bill, also based on a terrible idea from the UK: an “online harms” bill that will lead to widespread censorship.

Trudeau said Poilievre is “playing politics” by opposing the government’s forthcoming online harms bill — which is meant to combat hate speech, terrorist content and some violent material on the internet — while also endorsing a crackdown on some other online content.

Can’t win ‘em all, I guess. It’s sad that Canada continues to push these kinds of dangerous bills, even as its government should know better.

And, of course, it’s no surprise that days after Trudeau said all this, the new version of the Online Harms bill was officially introduced. It’s dangerous in all sorts of ways that we’ll be covering soon. So, while it’s nice that Trudeau stops at age verification for porn, the fact that he has no issue with an online censorship bill remains a real problem.

Filed Under: adult websites, age verification, canada, justin trudeau, online harms, porn license

Papers Please, But For Porn Scheduled For A 2025 Debut In The UK

from the sorry,-these-papers-are-kind-of-stuck-together dept

Stop-start. Push-pull. Yank-tug. That’s the way things have been going in the UK. One would expect better performance from lawmakers with a hard-on for porn.

No. Not that way. (Although, maybe that way.) The UK government has spent several years trying to talk service providers, recalcitrant legislators, and the general public into trading away a bit of their privacy to save the children from the scourge of online pornography.

Porn filters have been proposed, implemented, and abandoned. Age-verification methods have been proposed, examined, and re-examined. The proposal that has proven most resilient involves letting service providers know who you are, how old you are, and that you definitely intend to consume porn content.

This proposal obviously raises privacy concerns. While UK residents might be (reluctantly) willing to inform their service providers they’d like to see some pornography, they’re likely far less willing to notify their government of this same information. The government says it’s only interested in keeping those under the age of 18 away from adult sites, but the mechanisms for doing so necessitate the government being involved in some way with this gathering of very personal information from internet users.

Nonetheless, the UK government continues to insist this is the only practical option: demanding personally identifying info from porn fans. As Laurie Clarke reports for Politico, blocking access to porn sites by default will soon be the new normal in the United Kingdom. Time for everyone to reach into their pants to locate their… um… wallets, purses, etc.

Before diving into a sea of graphic content, they’ll first be asked to prove they’re over 18 — and this time, ticking a box won’t cut it.

Porn perusers will soon have to prove their age by uploading an identity document like a passport, registering a credit card, presenting their face to AI-powered scanning technology, or using a handful of other methods outlined in draft guidance from the regime’s regulator, Ofcom.

Sure, Ofcom may be seeking input, but it hardly seems like anyone’s opinions will matter. Comment all you want, but it’s unlikely to change what’s coming: the debut of porn filtering that can only be removed by proving to providers (and, ultimately, the UK government) that you are who you say you are and that you are someone who wants to view porn.

The upshot of this move is that UK residents won’t stop trying to access porn. They’ll just start looking for it in places beyond the reach of UK legislators. That’s what’s happening in the United States, thanks to a handful of states passing legislation that requires porn sites to gather and retain personal information about their users.

Rather than gather incriminating information on behalf of a handful of state governments, US porn sites have simply decided to block users they believe reside in affected states. Traffic has plummeted at these sites as a result of these laws, but that hardly suggests most users were underage. Instead, it suggests people aren’t willing to share their porn viewing habits with government entities. Even if regulators can’t (currently) access this data, the perception is that they can… or will, as soon as they can come up with a justification for doing so.

Everyone’s less safe now, including the minors these laws were crafted to “protect.”

“These people did not stop looking for porn,” an [Pornhub parent company] Aylo spokesperson said. “They just migrated to darker corners of the internet that don’t ask users to verify age, that don’t follow the law, that don’t take user safety seriously, and that often don’t even moderate content.”

That’s going to happen in the UK, too. Ofcom knows this. And if Ofcom knows this, legislators should know this.

A survey commissioned by the regulator last year found that 55 percent of porn viewers said they would look for porn elsewhere if asked to verify their age, while only 29 percent said that they would comply.

The percentage of UK residents willing to look elsewhere for porn jumped to 80 percent when respondents were asked whether they would be willing to upload copies of identifying documents to websites to obtain access.

And it’s not just porn sites that will have to start demanding people’s papers upon entry. It’s also sites like X, Reddit, Wikimedia, and other third-party content hosts that allow pornography on their sites. Locking minors out of these sites means denying them access to plenty of non-porn content that they might find useful, educational, or otherwise engaging.

Then there are the even more problematic aspects of instituting this policy. Erecting a wall seems like a good idea until you realize everyone else has already found a way around it. Ask anyone who’s instituted a paywall how that’s going. Regulators in the UK are actually considering heading down the road to totalitarianism. You know, for the children.

To solve the issue of evasion with VPNs, “the answer is obviously to either impose age assurance globally” or for porn sites to begin detecting and blocking VPN traffic, says Corby.

Restricting VPN use itself — usually a hallmark of autocratic regimes — has been promoted by Labour MPs as a potential means of preventing U.K. residents from circumventing the Online Safety Act.

As anyone with a bare minimum of world history under their belt can tell you, once you head down that road, it’s much easier to continue on than reverse direction. Restricting VPN use won’t just keep minors from accessing porn, but it will prevent journalists from talking to sources, businesses from maintaining secure remote connections, and inflict a lot of pain on people who simply don’t believe it’s anyone else’s business what they do online.
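Mechanically, "detecting and blocking VPN traffic" usually means comparing a visitor's IP address against published lists of datacenter and VPN exit ranges — a blunt instrument that lags behind providers spinning up new ranges, which is part of why the evasion problem never goes away. A minimal sketch of such a check, using Python's standard `ipaddress` module with made-up documentation ranges standing in for a real (and constantly stale) commercial feed:

```python
import ipaddress

# Hypothetical CIDR ranges attributed to VPN/datacenter providers.
# Real deployments ingest thousands of ranges from commercial feeds,
# which go stale as providers add addresses -- the core weakness here.
VPN_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # RFC 5737 documentation range
    ipaddress.ip_network("198.51.100.0/24"),  # RFC 5737 documentation range
]

def looks_like_vpn(client_ip: str) -> bool:
    """Return True if the client IP falls inside any listed VPN range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in VPN_RANGES)

print(looks_like_vpn("203.0.113.55"))  # inside a listed range -> True
print(looks_like_vpn("192.0.2.10"))    # not on the list -> False
```

A user whose VPN provider's range isn't on the list sails straight through, which is exactly the cat-and-mouse dynamic the article describes.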

Filed Under: ofcom, porn, porn filters, porn license, uk

Unsurprisingly, Pornhub Blocks Arkansas IP Addresses

from the don't-blame-us-for-ruining-Gov-Huckabee’s-husband’s-morning-wood dept

It has been a busy day for Arkansas.

Pornhub.com geo-blocked IP addresses in Arkansas in the latest protest against unworkable age verification laws. Arkansas is the fifth state to have an age-gating statute enter into force and the fourth to be geo-blocked by Pornhub’s parent company, the Montréal-based firm MindGeek, owned by Ethical Capital Partners in Ottawa. With a population of about 3 million people, Arkansas adds to the growing number of blocked users in the United States — Earth’s largest consumer base for legal and consensual pornography. And, as we’re seeing across the board, people aren’t happy with the block, but it isn’t as though these laws will stop anyone from watching porn: VPNs are gaining popularity, and not all porn sites are following these laws.
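Geo-blocking of this sort is mechanically simple: resolve the requesting IP to a region via a GeoIP database, then refuse service to covered regions. A rough sketch of the idea, with a stub lookup table standing in for a real GeoIP database — the addresses, mappings, and state list are invented for illustration, not Pornhub's actual implementation:

```python
# Stub lookup standing in for a real GeoIP database; the
# address-to-state mappings below are invented for illustration.
IP_TO_STATE = {
    "198.51.100.7": "AR",
    "203.0.113.9": "CA",
}

# Illustrative set of states with age-gating laws in force.
BLOCKED_STATES = {"AR", "UT", "LA", "MS", "VA"}

def handle_request(client_ip: str) -> str:
    """Return a block response for IPs geolocated to a blocked state."""
    state = IP_TO_STATE.get(client_ip)  # real systems do a GeoIP lookup here
    if state in BLOCKED_STATES:
        # HTTP 451 is the status code for content withheld for legal reasons.
        return "451 Unavailable For Legal Reasons"
    return "200 OK"

print(handle_request("198.51.100.7"))  # Arkansas IP -> blocked
print(handle_request("203.0.113.9"))   # California IP -> served
```

Note that the block is only as accurate as the GeoIP data, and it says nothing about who is behind the IP — which is why a VPN exit in another state defeats it trivially.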

But, who is to blame for the Pornhub geo-block? Pornhub or Ethical Capital Partners? The state? It’s basic economics, folks. Reasonable regulations often make sense for various industries. Without government regulation, we too frequently end up with early Industrial Revolution-style labor quagmires: people get exploited, customers are at the whim of unaccountable executives, and markets drift toward monopoly. But those are general regulations that apply across the board to protect labor and customers.

There is a huge difference when regulations prevent entry or exit from a market for a variety of reasons, or when they target specific types of companies. The age verification laws in these states are textbook cases of misinformed regulation. In my time reporting on the porn industry, I have seen time and again do-gooder politicians who claim to have a moral imperative to “protect the kids.” Protecting the kids, in the eyes of such politicians, means restricting access to adult content and openly censoring otherwise First Amendment-protected forms of free speech and expression.

Arkansas Gov. Sarah Huckabee Sanders signed into law Senate Bill 66, which requires a government identification or a personal identification document to verify one’s age in order to wank. The state legislature, which is dominated by a Republican supermajority, claimed that the bill was a “bipartisan” show of concern for minors. Truthfully, this “bipartisanship” is based exclusively on the political necessity for Democrats in the minority to effect any sort of legislative change that isn’t blocked by the QAnon-laced policies of Gov. Sanders and her cronies in the state legislature.

It’s clear that Pornhub shouldn’t be blamed for this new development in the ongoing drama related to age verification in the United States. In a blog post, Pornhub said the reason it is blocking entire states is that the way these “new laws are executed by lawmakers is ineffective and puts users’ privacy at risk.” That’s absolutely true. The majority of these laws don’t consider the fallout: data bloat, security risks, and other fucked-up consequences.

Also, the enforcement of these laws isn’t consistent or uniform. Given the nature of the federal system, there are clear shortcomings in the ability of U.S. states to enforce these laws effectively and equitably. But what age verification laws try to do is regulate interstate commerce without the constitutional authority to do so. Only Congress, through federal legislation, can regulate interstate commerce in the ways these age verification bills contemplate — age estimation tech, AI-assisted biometrics, and simple interventions such as requesting sensitive personally identifiable information rather than the openly available, non-sensitive kind that can be found via social media.

As I’ve written for Techdirt before, Pornhub and its ownership group are on record advocating for device-based age verification solutions that try to retain as little data as possible. They say so in the blog post, and a partner for Ethical Capital Partners told me the same thing several times in calls and texts throughout my reportage on the age verification push in Utah. This is additionally the case for a variety of other sites that want to comply with the law and be viewed as ethical, transparent, and responsible. But, there is no simple solution for ensuring trust and safety policies are effective on porn sites or social media platforms that permit uncensored nudity, like Reddit or OnlyFans.

Age verification laws are currently being challenged in federal district courts across the country as violations of the First and Fourteenth Amendments. The Free Speech Coalition, a trade group representing the adult entertainment industry, is leading groups of plaintiffs asking courts in Utah and Louisiana to permanently enjoin the implementation and enforcement of age verification laws. In Arkansas, NetChoice filed a lawsuit against the state government asking a federal judge to block the Social Media Safety Act, an age verification measure requiring a user or a parent to submit identification material in order to create new accounts. Collectively, these proposals are simply unworkable ideological statements that have little chance of surviving judicial review. Plus, it goes to show how backward conservative politicians can be on free speech topics.

The age verification law enters into force tomorrow, August 1.

Michael McGrady is the contributing editor of AVN.com.

Disclosure: The author is a member of the Free Speech Coalition. He wasn’t compensated by the coalition or its members to write this column.

Filed Under: 1st amendment, adult content, age verification, arkansas, geoblocking, porn, porn license, pornhub, sarah huckabee sanders
Companies: ethical capital partners, mindgeek, pornhub

Texas Legislature Convinced First Amendment Simply Does Not Exist

from the it's-time-for-them-to-learn dept

Tue, Jun 20th 2023 12:06pm - Ari Cohn

Over the past two years, there has been a concerted push by state legislatures to regulate the Internet, the likes of which have not been seen since the late 90s/early aughts. Content moderation, financial relationships between journalists and platforms, social media design and transparency, “national security,” kids being exposed to “bad” Internet speech—you name it, a state legislature has introduced an unconstitutional bill about it. So it’s no surprise that the anti-porn crowd seized the moment to once again exhibit a creepy and unhealthy interest in what other people do with their pants off.

The Texas legislature, also unsurprisingly, was all too happy to help out. Last week, Texas Governor Greg Abbott signed into law HB 1181, which regulates websites that publish or distribute “material harmful to minors,” i.e., porn.

Start from the premise that pornography is protected by the First Amendment, but that it may be restricted for minors where it could not be for adults under variable obscenity jurisprudence.

The law’s requirements apply to any “commercial entity,” explicitly including social media platforms, that “intentionally publishes or distributes material on an Internet website… more than one-third of which” is porn. That’s a problematic criterion in the first place. I don’t know that there’s an easy (or even feasible) way for a social media platform to know precisely how much porn is on it (perhaps there is, though). And what about a non-social media website—what is the denominator? If a website has articles (which is definitely the reason you’re on it, I know) plus naughty pictures, is the percentage calculated by comparing the number of porn-y things to the number of articles? Words? Pages? Who knows—the law sure doesn’t say.
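The denominator problem is easy to see with numbers. Take a hypothetical site with 40 adult images alongside 60 articles totaling 50,000 words: whether it crosses the one-third threshold depends entirely on what you count (all figures below are invented for illustration):

```python
# Hypothetical site inventory -- numbers invented to illustrate how the
# "more than one-third" test flips with the choice of denominator.
porn_items = 40
articles = 60
total_words = 50_000

# Count porn items against total posts: 40 / (40 + 60) = 40%,
# over the one-third threshold -> covered by the law.
by_items = porn_items / (porn_items + articles)

# Count each image as one "unit" against total words: a vanishingly
# small share -> not covered.
by_words = porn_items / (porn_items + total_words)

for label, share in [("per item", by_items), ("per word", by_words)]:
    status = "covered" if share > 1 / 3 else "not covered"
    print(f"{label}: {share:.1%} -> {status}")
```

Same site, same content, opposite answers — which is the point: the statute never says which denominator controls.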

But that’s the least of the law’s problems. HB 1181 requires qualifying entities (however determined) to do two things, both of which clear First Amendment hurdles about as well as a rhinoceros competing in a steeplechase.

Age-Verifying Users

This has been a recurring theme in state and federal legislation recently. HB 1181 requires covered entities to “use reasonable age verification methods” to ensure that users are 18 or older before allowing access.

We’ve been here before, and explaining this over and over again is getting exhausting. But I’ll do it again, louder, for the people in the back.

Age Verification Laws: A Brief History

In the beginning (of the web) there was porn. And the Government saw that it was “icky” and said “let there be laws.”

In 1996, Congress passed the Communications Decency Act, prohibiting the knowing transmission or display of “obscene or indecent” messages to minors using the Internet. A unanimous Supreme Court struck down the law (with the exception of Section 230) in Reno v. ACLU, holding that it chilled protected speech, in part because there was no way for users in chat rooms, newsgroups, etc. to know the age of other users—and even if there was, a heckler’s veto could be easily imposed by

any opponent of indecent speech who might simply log on and inform the would-be discoursers that his 17-year-old child…would be present.

The Court rejected the government’s argument that affirmative defenses for use of age-verification methods (in particular credit card verification) saved the law, noting that not every adult has a credit card, and that existing age verification methods did not “actually preclude minors from posing as adults.”

So Congress tried again, passing the Child Online Protection Act (COPA) in 1998, ostensibly narrowed to only commercial enterprises, and again containing affirmative defenses for using age-verification. Again, the courts were not buying it: in a pair of decisions, the Third Circuit struck down COPA.

With respect to the viability of age verification, the court found that the affirmative defense was “effectively unavailable” because, again, entering a credit or debit card number does precisely nothing to verify a user’s age.

But more importantly, the court ruled that the entire idea of conditioning access to material on a government-imposed age verification scheme violates the First Amendment. Noting Supreme Court precedent “disapprov[ing] of content-based restrictions that require recipients to identify themselves affirmatively before being granted access to disfavored speech,” the Third Circuit ruled in 2003 that age-verification would chill protected speech:

We agree with the District Court’s determination that COPA will likely deter many adults from accessing restricted content, because many Web users are simply unwilling to provide identification information in order to gain access to content, especially where the information they wish to access is sensitive or controversial. People may fear to transmit their personal information, and may also fear that their personal, identifying information will be collected and stored in the records of various Web sites or providers of adult identification numbers.

In its second decision, coming in 2008, the court again agreed that “many users who are not willing to access information non-anonymously will be deterred from accessing the desired information.” And thus, after the Supreme Court denied cert, COPA—and the notion that government could force websites to age-verify users—died.

Until now.

Age Verification Today

Has anything changed that would render these laws newly-constitutional? One might argue that age-verification technologies have improved, and are no longer as crude as “enter a credit card number.” I suppose that’s true in a sense, but not a meaningful one. HB 1181 requires age verification by either (a) a user providing “digital identification” (left undefined), or (b) use of a commercial age-verification system that uses either government-issued ID or “a commercially reasonable method that relies on public or private transactional data.”

It stands to reason that if a minor can swipe a parent’s credit card for long enough to enter it into a verification service, they can do the same with a form of Government ID. Or even easier, they could just borrow one from an older friend or relative. And like entering a credit card number, simply entering (or photographing) a government ID does not ensure that the person doing so is the owner of that ID. And what of verification solutions that rely on selfies or live video? There is very good reason to doubt that they are any more reliable: the first page of Google search results for “trick selfie verification” turns up numerous methods for bypassing verification using free, easy-to-use software. Even the French, who very much want online age-verification to be a thing, have acknowledged that all current methods “are circumventable and intrusive.”

But even assuming that there was a reliable way to do age verification, the First Amendment problem remains: HB 1181 requires adult users to sacrifice their anonymity in order to access content disfavored by the government, and First Amendment jurisprudence on that point has not changed since 2008. Texas might argue that because HB 1181 prohibits websites or verification services from retaining any identifying information, the chilling harm is mitigated. But there are two problems with that argument:

First, on a practical level, I don’t know how that prohibition can work. A Texas attorney general suing a platform for violating the law will have to point to specific instances where an entity failed to age-verify. But how, exactly, is an entity to prove that it indeed did perform adequate verification, if it must delete all the proof? Surely just keeping a record that verification occurred wouldn’t be acceptable to Texas—otherwise companies could simply create the record for each user and Texas would have no way of disproving it.

Second, whether or not entities retain identification information is entirely irrelevant. The chilling effect isn’t dependent on whether or not a user’s browsing history or personal information is ultimately revealed. It occurs because the user is asked for their identifying information in the first place. Few if any users are likely to even know about the data retention prohibition. All they will know is that they are being asked to hand over ID to access content that they might not want associated with their identity—and many will likely refrain as a result. The de-anonymization to anyone, for any amount of time, is what causes the First Amendment harm.

Technology has changed, but humans and the First Amendment…not so much. Age verification remains a threat to user privacy and security, and to protected First Amendment activity.

Anti-Porn Disclaimers

HB 1181 also requires covered entities to display three conspicuous notices on their home page (and any advertising for their website):

TEXAS HEALTH AND HUMAN SERVICES WARNING: Pornography is potentially biologically addictive, is proven to harm human brain development, desensitizes brain reward circuits, increases conditioned responses, and weakens brain function.

TEXAS HEALTH AND HUMAN SERVICES WARNING: Exposure to this content is associated with low self-esteem and body image, eating disorders, impaired brain development, and other emotional and mental illnesses.

TEXAS HEALTH AND HUMAN SERVICES WARNING: Pornography increases the demand for prostitution, child exploitation, and child pornography.

It’s obvious what Texas is trying to do here. And it’s also obvious what Texas will argue: “The government often forces companies to place warnings on dangerous products, just look at cigarette packages. That’s what we’re doing here too!”

You can likely anticipate what I have to think about that, but it’s worth interrogating in some depth to see exactly why it’s so very wrong.

What Kind of Speech Regulation is This?

Obviously, HB 1181 compels speech. In First Amendment jurisprudence, compelled speech is generally anathema, and subject to strict scrutiny. But the government has more leeway to regulate (or compel) “commercial speech,” that is, non-misleading speech that “does no more than propose a commercial transaction” or “relate[s] solely to the economic interests of the speaker and its audience.”

At the outset, I am skeptical that this is a commercial speech regulation. True, it applies only to “commercial entities” (defined effectively as any legally recognized business entity), but speech by a business entity is not ipso facto commercial speech, nor does a profit motive automatically render speech “commercial.” Imagine, for example, that 30% of Twitter content was found to be pornographic. Twitter makes money through its Twitter Blue subscriptions and advertisements. But does that make Twitter as a whole, and every piece of content on it, “commercial speech?” Certainly not. See Riley v. National Federation of the Blind, 487 U.S. 781, 796 (1988) (when commercial speech is “inextricably intertwined with otherwise fully protected speech,” the relaxed standards for commercial speech are inapplicable).

And even as applied to commercial pornography websites in the traditional sense (presuming that in this application, courts would view the notice requirement as a commercial speech regulation), HB 1181 might be in trouble. In International Outdoor, Inc. v. City of Troy, the Sixth Circuit persuasively reasoned that even commercial regulations are subject to strict scrutiny when they are content based (as HB 1181 plainly is), particularly where they also regulate noncommercial speech (as HB 1181 plainly does). If strict scrutiny is the applicable constitutional standard, the law is certainly dead.

But let’s assume for the sake of argument that we are in Commercial Speech Land, because either way the notice requirement is unconstitutional.

Constitutional Standards for Compelled Commercial Speech

For a commercial speech regulation to be constitutional, it must directly advance a substantial government interest and be narrowly tailored so as not to be more extensive than necessary to further that interest—known as the Central Hudson test.

But there’s another wrinkle: certain compelled commercial disclosures are subjected to the lower constitutional standard articulated in Zauderer v. Office of Disciplinary Counsel. Under Zauderer, compelled disclosures of “purely factual and uncontroversial information” must only “reasonably relate” to a substantial government interest and not be unjustified or unduly burdensome. What type of government interest suffices has been a matter of controversy: Zauderer (and Supreme Court cases applying it) have, on their face, related to remedying or preventing consumer deception in advertising. But multiple appellate courts have held that the government interest need not be related to consumer deception.

Would HB 1181 Receive the More Permissive Zauderer Analysis?

Setting aside the question of government interest for just a moment, the HB 1181 notices are clearly not governed by the lower Zauderer standard because in no way are they “purely factual and uncontroversial.”

In 2015, the U.S. Court of Appeals for the D.C. Circuit struck down a regulation requiring (to simplify) labeling of “conflict minerals.” While the origin of minerals might be a factual matter, the court found that the “not conflict free” label was not “non-ideological” (i.e., uncontroversial): it conveyed “moral responsibility for the Congo war” and required sellers to “publicly condemn [themselves]” and tell consumers that their products are “ethically tainted.”

Dissenting, Judge Srinivasan would have read “uncontroversial” as relating to “factual”—that is, disclosures are uncontroversial if they disclose facts that are indisputably accurate. Even under Judge Srinivasan’s more permissive construction, the HB 1181 notices are not factual and uncontroversial. They are, quite simply, standard hysterical anti-porn lobby talking points—some rejected by science, the rest hotly disputed by professionals and the scientific literature.

And then the Supreme Court decided National Institute of Family & Life Advocates v. Becerra (NIFLA), striking down a California regulation requiring family planning clinics to disseminate a government notice regarding state-provided family-planning services, including abortion—”anything but an ‘uncontroversial’ topic,” the Court noted. In a later case, the Ninth Circuit explained that the notices in NIFLA were not “uncontroversial” under Zauderer because they “took sides in a heated political controversy, forcing [clinics opposed to abortion] to convey a message fundamentally at odds with its mission.”

However you look at it, these notices are not “factual and uncontroversial.” They make claims that are by no means established facts (one might even call them opinions), put the government thumb on the scale in support of them, and force speakers to promote controversial hot-button views that condemn their own constitutionally protected speech. They are simply not the type of disclosures that Zauderer contemplates.

Do the Notices Satisfy the Central Hudson Test?

I’ll admit to hiding the ball a little in order to talk about Zauderer. Regardless of whether Zauderer or Central Hudson controls, the first step of the analysis would remain the same: does the government have a substantial interest?

It seems clear to me that the answer is “no,” so the notice requirement would fail scrutiny either way.

Texas may argue that its interest is “protecting the physical and psychological well-being of minors,” as the federal government asserted when defending the CDA and COPA. While the Supreme Court has held that interest to be compelling, I’m not sure Texas can plausibly claim it here. If the harm to minors comes from viewing porn, but the age verification requirement prevents them from seeing the porn while they are minors, is there a substantial government interest in telling them that the porn they can’t even access is “bad?” To my mind, it doesn’t adequately square. (Admittedly, this may be more of a question of whether the notices “directly advance” the government interest.)

The plain language of the notices evince a much broader theme. To the extent that Texas is trying to protect minors, it seems that it is also trying to protect them from the “harms” of porn even once they are no longer minors—that is, to keep them from getting “hooked on porn” ever. In that sense, the notice requirement is aimed as much at adults as it is at minors. The message is clear: porn is harmful and bad—no matter what age you are—and you should abstain from consuming it.

Here’s where Texas will invariably analogize HB 1181 to mandated warning labels on cigarettes. “It’s constitutionally permissible to force companies to label dangerous products, and that’s all we’re doing,” Texas will say. But the government interest there is to reduce smoking rates—thereby protecting consumer and public health from a physical product that definitively causes serious and deadly physical disease.

HB 1181 is different in every respect, by a country mile. Distilled to its core, the government interest that Texas must be asserting is: generally reducing the consumption of protected expression disfavored by a government that considers it psychologically harmful to readers/viewers. HB 1181 seeks to protect citizens not from a product with physical effects, but rather, from ideas and how they make us think and feel. Can that be any government interest at all, let alone a substantial one?

It’s a startling proposition that would give government the power to shape the contours of public discourse in ways entirely at odds with First Amendment principles. Could the government invoke an interest in protecting the public from the psychological harms of hateful speech and demand that any commercial entity distributing it affix a warning label dissuading readers from consuming it? What about the damaging effects (including on health) of political polarization? Could the government rely on those harms and force “partisan media” to issue warnings about the dangers of their content? Must gun-related periodicals warn readers that “gun culture” leads to mass shootings at the government’s demand? Or can fashion magazines be forced to tell readers that looking at skinny people causes low self-esteem and eating disorders? You get the picture.

Consider New York’s “Hateful Conduct Law,” recently struck down by a federal district court in a challenge brought by Eugene Volokh and two social media platforms. That law requires any commercial operator of a service that allows users to share content to establish a mechanism for users to complain about “hateful conduct” and post a policy detailing how such reports will be addressed. (Notably, the court rejected New York’s assertion that the law only compelled commercial speech.) While the court ultimately accepted “reducing instances of hate-fueled mass shootings” as a compelling government interest (and then held the law not narrowly tailored), it explained in a footnote that “a state’s desire to reduce [constitutionally protected speech] from the public discourse cannot be a compelling government interest.”

And that is clearly the aim of the HB 1181 notices: to reduce porn consumption. To my mind, this is no different than the Supreme Court’s rejection in Matal v. Tam of a government interest in “preventing speech…that offend[s].” Offense, after all, is a psychological impact that can affect mental well-being. But the First Amendment demands that government stay out of the business of deciding whether protected speech is “good” or “bad” for us.

The wholly unestablished nature of the claims made in HB 1181’s notices also cuts against the sufficiency of Texas’s interest. In Brown v. Entertainment Merchants Association, California could not draw a direct link between violent video games and “harm to minors,” so it instead relied on “predictive judgments” based on “competing psychological studies” to establish a compelling government interest. But the Supreme Court demanded more than “ambiguous proof,” noting that the case California relied on for a lower burden “applied intermediate scrutiny to a content-neutral regulation.” (emphasis in original)

While (presuming again that this is in fact a commercial speech regulation) we may be in Intermediate Scrutiny Land, we are also in Unquestionably Content-Based Land—and I think that counts for something. In all respects, HB 1181’s notice requirement is a content-based regulation justified by the (state’s theorized) reaction of listeners. See Boos v. Barry, 485 U.S. 312, 321 (1988) (“[I]f the ordinance…was justified by the city’s desire to prevent the psychological damage it felt was associated with viewing adult movies, then analysis of the measure as a content-based statute would have been appropriate.”). While I am doubtful that Texas can ultimately assert any substantial interest here, at the very least any asserted interest must be solidly supported rather than built on moralistic cherry-picking.

In sum, I do not see how any state interest in reducing the consumption (and thus ultimately proliferation) of entirely protected speech can itself be a legitimate one. By extension, I think that invalidates any government interest in protecting recipients of that speech from the psychological effects of that speech—the entire point of expression is to have some kind of impact. Speech can of course have harmful effects at times, and the government is free to use its own speech, on its own time, to encourage citizens to make healthy decisions. But it can’t force speakers to warn recipients that their speech ought not be listened to.


So why do state legislatures keep introducing and passing laws that are undercut by such clear lines of precedent? The “innocent” answer is that they simply do not care: once they’ve completed the part where they “do something,” they can get the media spots and do the chest-pounding and fundraising—whether the law is ultimately struck down is immaterial. The more sinister answer is that, believing that they have a sympathetic Supreme Court, they are actively manufacturing cases in the hopes that they can remake the First Amendment to their liking. Here’s hoping they fail.


1 In contrast, I think that a porn site that provides content (especially if user-uploaded) for free and relies on revenue from advertising is more akin to Twitter than it is to a pay-for-access site for commercial speech purposes.

2 For a good treatment of the Supreme Court’s Zauderer jurisprudence and analysis of its applicability to content moderation transparency laws, see Eric Goldman, Zauderer and Compelled Editorial Transparency: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4246090

3 Notably, some courts have expressed skepticism (without deciding) that a government could even assert “a substantial interest in discouraging consumers from purchasing a lawful product, even one that has been conclusively linked to adverse health consequences [i.e., cigarettes].”

4 Unlike cigarettes, the ideas and expression contained within books, films, music, etc (as opposed to the physical medium) are not considered “products” for products liability purposes, and courts have rejected invitations to hold otherwise on First Amendment grounds. See, e.g., Winter v. G.P. Putnam’s Sons, 938 F.2d 1033 (9th Cir. 1991); Gorran v. Atkins Nutritionals, Inc., 464 F. Supp. 2d 315 (S.D.N.Y. 2006).

Originally posted to Ari Cohn’s Substack.

Filed Under: 1st amendment, adult content, age verification, hb 1181, porn license, texas

Louisiana Law Now Requires Age Verification At Any Site Containing More Than One-Third Porn

from the [whips-out-dipstick-to-check-site's-porn-level] dept

Very few issues have generated as much ridiculous legislation as preventing minors from accessing pornography. Almost everyone agrees something must be done. And most seem to agree that doing anything — no matter how stupid — is better than doing nothing.

Extremely stupid versions of “something” have cropped up around the nation, most of them propelled by a self-proclaimed anti-porn activist who once tried to marry his own computer in protest of gay marriage and has engaged in a number of performative lawsuits, including one against Apple for failing to prevent him from accessing porn on his devices.

Many of these bills have gone nowhere. However, a few have actually become law, providing the legislation’s supporters with some cheap wins that look good on the anti-porn resume, even if they don’t really do much to actually prevent children from accessing explicit content.

A law passed last year in Louisiana has just gone into effect, requiring age verification at sites that meet the state’s watershed for porn content.

The porn industry has been around for a while and in today’s digital age business is booming. When Laurie Schlegel isn’t seeing her patients who struggle with sex addiction, she’s at the Louisiana State Capitol.

The Republican state representative from Metairie passed HB 142 earlier this year requiring age verification for any website that contains 33.3% or more pornographic material.

“Pornography is destroying our children and they’re getting unlimited access to it on the internet and so if the pornography companies aren’t going to be responsible, I thought we need to go ahead and hold them accountable,” said Schlegel.

There’s some weird stuff going on here, likely due to the law [PDF] being about 90% performative nonsense and 10% legalese.

First off, there’s the strangely arbitrary cutoff point of one-third porn content. Unmentioned anywhere is how porn percentage will be determined. Also unmentioned is whether or not the law still applies when the total percentage of porn content dips below one-third.

This language appears to be borrowed from the UK’s disastrous porn filter legislation, which proposed the same cutoff line while similarly being vague about how the porn percentage of sites would be determined.

That sets the baseline for enforcement, suggesting a government entity might have to access all available content on a site to determine whether or not it can be held liable (via civil suits brought by residents or the state attorney general) for failing to properly conduct age verification.

But to get to all of this, one first has to wade through a paragraph presumably written by Rep. Schlegel, which supposedly justifies everything that comes after it.

Pornography contributes to the hyper-sexualization of teens and prepubescent children and may lead to low self-esteem, body image disorders, an increase in problematic sexual activity at younger ages, and increased desire among adolescents to engage in risky sexual behavior. Pornography may also impact brain development and functioning, contribute to emotional and medical illnesses, shape deviant sexual arousal, and lead to difficulty in forming or maintaining positive, intimate relationships, as well as promoting problematic or harmful sexual behaviors and addiction.

This sounds a lot like the stuff said by others pushing anti-porn legislation, a lot of it composed by a man who sued Apple for allowing him to access porn. It’s a smokescreen that allows prudish legislators to hide their desire to control what content even adults can consume (by raising state-sponsored barriers) behind statements about concerns for the health and well-being of constituents.

This may be Schlegel’s own writing, however. Her statements to WAFB contain plenty of other absurd assertions.

She said problems like depression, erectile dysfunction, lack of motivation, and fatigue can be directly linked to porn. She also said to prevent these issues from occurring at younger ages, this law is imperative.

“It’s tied to some of the biggest societal ills of human trafficking and sexual assault. And in my own practice, the youngest we’ve ever seen is an 8-year-old,” noted Schlegel.

There’s little if anything linking porn to sexual assault. And I don’t know which of these problems the state rep observed in an 8-year-old, but I sincerely hope it wasn’t erectile dysfunction.

The law may prevent sites required to verify the ages of visitors from collecting or storing credentials/personal info used for verification, but the author of the bill thinks the easiest way to verify age is to run it through a verification app created by a private company in partnership with the Louisiana government.

According to Schlegel, websites would verify someone’s age in collaboration with LA Wallet. So, if you plan on using these sites in the future, you may want to download the app.

“I would say so,” said Sara Kelley, project manager with Envoc. “I mean, I think it’s a must-have for anyone who has a Louisiana state ID or driver’s license.”

LA Wallet is a digital driver’s license. At the time of its creation, it was the first of its kind in the country. Nudging porn viewers towards state-sponsored apps is all part of the plan. If people believe (correctly or incorrectly) the government may have some way of knowing they’re visiting sites containing at least 33.3% porn, they’re less likely to visit these sites. So, this law may claim it’s for the children, but it’s all about steering people away from content certain legislators don’t like.

It also will nudge sites to more directly police user-generated content for porn to help ensure they don’t inadvertently pass the one-third mark and open themselves up to litigation. The law controls content on both ends of the equation: the distributor and the consumer.

Not that the law is going to actually prevent kids from accessing porn. Plenty of porn can be found on sites not subject to the law. And plenty of porn can be easily accessed even with a state mandate in place. Since most sites affected by this aren’t actually located in Louisiana, they’re under no obligation to verify the ages of users, even if the users are located in this state. And the law creates no demand (nor could it without creating even greater privacy concerns) that sites police incoming internet traffic for users’ locations at the time of access.

It’s all a bunch of performative stupidity that, at best, will encourage stupid, performative people to file stupid, performative lawsuits. And maybe that’s really the end goal: the pointless hassling of tech companies for not being better parents to the children of Louisiana.

Filed Under: adult content, age verification, laurie schlegel, louisiana, porn, porn license

Arizona The Latest To Explore Dumb Porn Filter Law, This Time To Help Fund Trump's Fence

from the filtering-the-naughty-bits dept

Thu, Jan 24th 2019 06:22am - Karl Bode

For some time now, a man by the name of Chris Sevier has been waging a fairly facts-optional war on porn. Sevier first became famous for trying to marry his computer to protest same sex marriage back in 2016. He also tried to sue Apple after blaming the Cupertino giant for his own past porn addiction, and has gotten into trouble for allegedly stalking country star John Rich and a 17-year-old girl. Sevier has since been a cornerstone of an effort to pass truly awful porn filter legislation in more than 15 states under the disingenuous guise of combating human trafficking.

Dubbed the “Human Trafficking Prevention Act,” all of the incarnations of the law would force ISPs to filter pornography and other “patently offensive material.” The legislation would then force state residents interested in viewing porn to pony up a one-time $20 “digital access fee” to whitelist the internet’s naughty bits for each internet-connected device in the home. The proposal is patently absurd, technically impossible to implement, and yet somehow these bills continue to get further than they ever should across a huge swath of the boob-phobic country.

Once people have realized the ignorant futility (and under-handed sales pitch) of such model legislation, it usually fails to gain any steam in most states. But it’s back this week with a decidedly new wrinkle in Arizona, where State Rep. Gail Griffin is pushing Arizona House Bill 2444. HB 2444 would mandate that any Arizona internet user would need to file a request if they want to access porn online, proving they’re at least 18 years of age. Porn seekers would then pay a one-time fee of $20 (plus additional fees) to access porn. Of course since this effort (like past efforts) is technically futile, the proposal is going nowhere.

But it’s getting some extra attention this week because the bill mandates the creation of something called the “John McCain Human Trafficking and Child Exploitation Prevention Fund,” which, if past precedent for these bills holds, likely has less than nothing to actually do with, and was never sanctioned by, the family of John McCain.

That fund, in turn, would go to a number of different causes, including a program designed “to uphold community standards of decency” and develop “programs for victims of sex abuse.” But Arizona’s incarnation of this dumb law has a small wrinkle in that Griffin is trying to claim this money could also be used to help fund Trump’s unnecessary border fence:

“At the top of the list of 10 explicit things the grants can be used for is ‘build a border wall between Mexico and this state or fund border security.’ Other grant purposes include mental health services, temporary housing, assisting victims, training, assisting school districts and assisting law enforcement. It is unclear if the McCain family is supportive of the legislation or a fund created in the late senator’s name.”

Again though, that funding is never going to happen because this law, like the last fifty times we’ve covered it, isn’t likely to pass. It isn’t likely to pass because filtering porn on such a level is arguably impossible, as we’ve seen every time someone attempts to erect such government-mandated censorship of porn. And it’s not going to pass because the folks behind the draft legislation it’s based on not only have absolutely no idea how the internet actually works, they consistently misrepresent what the law is supposed to actually do (and fund).

But the real story here isn’t the dumb filter, or the Trump wall wrinkle (though both will happily feed the clickbait machine for much of the week). The real story is how successful Sevier has been, despite his very checkered past, at getting more than a dozen state legislatures to mindlessly embrace terrible, unworkable legislation that happily gives a giant middle finger to the Constitution.

Filed Under: arizona, chris sevier, donald trump, filters, gail griffin, human trafficking prevention act, john mccain, porn, porn laws, porn license, wall

UK Gov't To Allow Citizens To Head To Nearest Newsstand To Buy Porn… Licenses

from the so-much-progress dept

The UK government’s continuing efforts to save the country’s children from the evils of internet porn are increasingly ridiculous. Filtering efforts applied by ISPs have managed to seal off access to plenty of non-porn sites while still remaining insanely easy to circumvent. The government — with a straight face — suggested there was nothing abnormal about internet customers turning over personal information to ISPs in exchange for permission to view porn. It’s as if building a database of the nation’s porn aficionados was the government’s original intent.

Since nothing about this was working about the way the porn filter’s architects (one of whom was arrested on child porn charges) imagined, the UK government decided the same non-functioning tech could be put to work filtering out “terrorist content.” Bad ideas have repeatedly been supplanted by worse ones, and now it appears UK citizens may be able to opt out of ISP porn-related data harvesting by [squints at press report] buying a porn license from their local newsagent.

High street newsagents are to sell so-called “porn passes” that will allow adults to visit over-18 websites anonymously.

The 16-digit cards will allow browsers to avoid giving personal details online when asked to prove their age.

Instead, they would show shopkeepers a passport or driving licence when buying the pass.

Trench coats are coming back! Somewhat of an ironic turn of events, given how much government effort was expended trying to limit the amount of public porn consumption by shutting down theaters and heavily regulating distribution of pornography. Instead of heading to porn shops in shady areas of town, porn consumers will be headed to newspaper kiosks to publicly announce their desire to consume porn in the privacy of their own homes.

I would imagine this will be regulated as well, with the government needing occasional access to porn license buyer lists to verify that newsagents are properly vetting porn license purchasers. Fortunately, the privacy-minded porn fan will now be providing personal info to someone other than their ISP. Unfortunately, they will be providing this to people in their neighborhood, possibly in front of their neighbors.

There is, however, a chance the purchase of a porn license may be treated as no different than a purchase of a pornographic publication: age verification only and no retention of records needed. Given the UK government’s incessant push for a sanitized web, it seems unlikely this will be the case. Once you’ve gotten into the business of controlling access to legal content, the tendency is to continue expansion, rather than treat this simply as a voluntary exchange between buyer and seller with only very limited government interest.

Filed Under: filters, for the children, internet, porn, porn license, uk

Lawmakers From The Great Theocracy Of Utah Looking To Block Porn On Cell Phones

from the pron dept

When we’ve talked in the past about government attempting to outright block pornography sites, those efforts have typically been aimed at sites hosting child pornography. Blocking child porn is a goal that’s impossible to argue against, though the methods for achieving it are another matter entirely. Too often, these attempts task ISPs and mobile operators with the job of keeping this material out of the public eye, which is equal parts burdensome, difficult to do, and rife with collateral damage. Other nations, on the other hand, have gone to some lengths to outright block pornography in general, such as in Pakistan for religious reasons, or in the UK for save-the-children reasons. If the attempts to block child porn resulted in some collateral damage, the attempts to outright censor porn from the internet resulted in a deluge of such collateral damage. For this reason, and because we have that pesky First Amendment in America, these kinds of efforts attempted by the states have run into the problem of being unconstitutional in the past.

But, as they say, if at first you don’t succeed, just try it in an even more conservatively prudish state again. Which brings us to Utah, where state Senator Todd Weiler is leading the effort to purge his state of any access to porn on mobile devices.

Utah Senator Todd Weiler has proposed a bill to rid the state of porn by adding Internet filters and anti-porn software on all cell phones and requiring citizens to opt-in before viewing porn online. It’s to save the children, he says. Weiler successfully pushed an anti-porn resolution through the state Senate earlier this year, declaring porn a “public health crisis.” He now hopes to take his movement a step further by making it harder for Utah citizens to have access to digital porn.

“A cell phone is basically a vending machine for pornography,” Weiler told TechCrunch, using the example of cigarettes sold in vending machines and easily accessed by children decades ago.

This is where we’d usually talk about how this sort of thing is almost certainly unconstitutional, not to mention how easily circumvented the attempt would be. And both of those remain true for this case. But I would like to instead focus on the lazy analogies Weiler chooses to make and let them serve as an example of how easily twisted people’s opinions can become if you simply add “saving the children” to the goals of a particular piece of legislation.

Let’s start with the quote above, although I promise you there is more from Senator Weiler that we’ll discuss. He claims that a cell phone is basically a porno vending machine, like a cigarette vending machine. The only problem with his analogy is how wildly untrue it is. A cigarette vending machine has no other purpose than, you know, vending smokes. A cell phone, on the other hand, has a few other purposes. Like playing video games, for instance. Or serving as a music device. Or making god damned phone calls. A claim that a phone is simply a vending machine for porn shows either a tragic misunderstanding of basic technology or, more likely, is simply a veiled hate-bomb at the internet itself. Regardless, it is not upon government to decide how our property is used lawfully. And it isn’t on government to parent children. We have people for that. They’re called parents.

But Weiler wasn’t done.

The senator says England was successful in blocking porn on the Internet. Prime Minister David Cameron pushed legislation through in 2013 requiring U.K. Internet service providers to give citizens the option to filter out porn.

The good Senator must have a strange definition for success, because the UK law is easily circumvented, has managed to censor all kinds of educational and informational non-pornography sites and material, and was created by a lovely chap who was later arrested on charges of child pornography himself. If one wishes to draw upon the success of something in order to push his own interests, that something probably shouldn’t be a complete dumpster fire.

Local Utah ISPs are already calling the plan unrealistic and comparing it to censorious governments that I am certain Senator Weiler would recoil from. Not that this matters, I guess, since Senator Weiler fantastically admits that he has no idea how this will all work under his law.

Weiler says he doesn’t know how it would work but just wants to put the idea out there and that his main concern is kids looking at porn.

“The average age of first exposure to hard-core pornography for boys is eleven years old,” he said. “I’m not talking about seeing a naked woman. I’m talking about three men gang-raping a woman and pulling her hair and spitting on her face. I don’t think that’s the type of sex ed we want our kids to have.”

Look, I usually like to back up my rebuttals to these types of things with facts and figures, but I just don’t have them in this case. That isn’t going to stop me from declaring that the claim that the average first exposure to pornography involves an eleven-year-old boy seeing exactly three men gang-raping a woman is a line of bullshit so deep that the Utah Senate certainly must provision knee-high boots to its membership for such a thing to even be suggested. And this should tell you everything you need to know about Senator Weiler’s plans: he doesn’t know how successful it’s been elsewhere, he doesn’t know how it works, and he’s willing to sell it to the public on the basis of a scary lie.

Oh, and it’s unconstitutional, so screw your law altogether.

Filed Under: government, mobile phones, porn, porn filters, porn license, todd weiler, utah

ISPs Reporting That UK's Web Filters Being Activated By Less Than 10% Of New Customers

from the more-porn-for-the-rest-of-us! dept

To call the UK’s institution of ISP-level web filters “stupid” isn’t just being blithely dismissive. For one, they don’t work. They block the wrong stuff. They let offensive stuff in. They’re easily circumvented. They’re advance scouts for government censorship. The only people who think web filtering is a good thing are those with the power to turn pet projects into national laws.

Add one more to the list: they’re hugely unpopular.

Broadband customers are overwhelmingly choosing not to use parental-control systems foisted on ISPs by the government – with take-up in the single digits for three of the four major broadband providers…

Only 5% of new BT customers signed up, 8% opted in for Sky and 4% for Virgin Media. TalkTalk rolled out a parental-control system two years before the government required it and has had much better take-up of its offering, with 36% of customers signing up for it.

Those pushing for filters would have you believe it’s something the public has been clamoring for to help them protect their children from the many evils of the internet. In reality, hardly anyone appears to care all that deeply about hooking up to a pre-censored connection.

There’s more than simply unpopularity going on here. The numbers skew low for several reasons. At this point, the rollout isn’t 100% complete and isn’t being offered to every new customer (something that becomes a requirement in 2015). Virgin Media (somewhat ironically) has been hooking customers up with the filthiest internet. Techs for that company have only been presenting the “unavoidable choice” to a little over a third of its new signups. Other ISPs’ techs have been more thorough, presenting new customers with the option nearly every time.

Many service providers say it’s also possible the filtering has been activated post-installation (Ofcom’s report only tracks filtering enabled at the time of install) or that customers are already using device-based filters.

Despite all of these factors, I wouldn’t expect adoption numbers to rise much. People generally don’t like the government telling them what they can and can’t access. Illegal content is already blocked at ISP level (as well as by several search engines), so what’s being added is nothing more than a governmental parent to watch over citizens’ shoulders as they surf the web. Those with children would probably prefer to run an open pipe and filter content at the device level. Not everyone in a household needs to be treated like a child, which is exactly what these filters (and their proponents) do.

Beyond that, activating a web filter goes against human nature, especially the exertion of free will and the general avoidance of embarrassment. Most people view themselves as “good” and uninterested in the long list of internet vices (porn being the most popular). But even if they truly believe they’d never view this content, they’d rather have it arrive unfiltered than be forced to approach their ISP weeks (or minutes…) later like a bit-starved Oliver Twist and ask, “Please, sir. May I have some porn?”

Filed Under: filters, isps, porn license, uk

As UK Government Considers Opt-Out Porn Censorship, Report Already Finds Overblocking On Mobile Networks

from the surprised?-me-neither dept

A few weeks ago, we noted the UK government was considering plans to bring in an opt-out form of censorship, in what would amount to a kind of porn license, and that such an approach runs the risk of blocking a far wider range of materials. Now the Open Rights Group (ORG) has released a report that shows the “child protection filters” on UK mobile Internet networks are already overblocking sites:

> It shows how systems designed to help parents manage their children’s access to the Internet can actually affect many more users than intended and block many more sites than they should. It reveals widespread overblocking, problems with transparency and difficulties correcting mistakes.

The report and an update show that sites affected are found in the realms of digital rights (La Quadrature du Net and the Tor Project), technology (GigaOM, London Ruby User Group and the start-up organization Coadec), lifestyle, community and politics.

As the ORG report highlights, this kind of overblocking does not augur well for any UK government attempts to widen filtering to include fixed-line access:

> If they follow a similar blueprint of ISP level filtering as mobile operators, all the problems we have highlighted would be reproduced at a larger scale. For example, most fixed-line connections are shared by a number of people using a variety of devices. Implementing filtering in that situation would require a range of approaches from whitelisting for young children to censorship-free connections for adults.

What’s rather depressing is that news that overblocking is already taking place is no surprise: it’s simply inevitable when this kind of network-level approach is taken. It underlines again why filtering has to be implemented locally:

> we hope that if the government does pursue such a policy it will be flexible, concentrate on users and devices rather than networks, allow the tools to be properly described as “parental controls” and above all avoid turning on blocking by default.

Despite the mounting evidence of overblocking on mobile networks, it’s not clear if any of those sensible suggestions will be implemented when it comes to fixed-line access — details of the proposed UK legislation have yet to be announced.

Follow me @glynmoody on Twitter or identi.ca, and on Google+

Filed Under: censorship, filters, overblocking, porn license, uk