Common Sense Media Has No Common Sense When It Comes To Internet Laws
from the common-sense-is-in-short-supply-here dept
Common Sense Media provides some really useful tools if you’re a parent looking to see if certain content is age appropriate. I’ve used it for years. But… also for years, the organization has been way out over its skis in supporting all sorts of absolutely horrible laws that would do real damage to the internet, to privacy, and to free speech. Over and over again, if there’s a bad internet law, Common Sense Media has probably supported it.
For years, it has railed against Section 230, including filing one of the most ridiculous amicus briefs out of the dozens filed in the Gonzalez v. Google case (all of which the Supreme Court effectively ignored while punting the case). It’s supported many laws that attack free expression, including the Age Appropriate Design Code laws in both the UK and California. It regularly pushes for ridiculous laws, including blatantly unconstitutional “protect the children!” laws around the US.
This is somewhat frustrating, because outside of its advocacy for bad laws, Common Sense Media’s actual content offering shows there’s a better way: providing parents, teachers, and even kids themselves with more information about how to handle different types of content and what’s appropriate and what’s not. Its main product shows us how empowering people with information is often a much better solution than laws that strip away people’s rights. But then… it advocates for stripping away rights.
The latest such case is in California, where the legislature has been non-stop pushing terrible, terrible internet laws “for the children.” We recently talked about SB 680 (which Common Sense Media supports!), which was pushed based on a Senator completely misreading an already-junk-science study. It appears that SB 680 may actually be dead in California, but instead, the legislature is pushing another problematic bill, and Common Sense Media has lost every last bit of common sense over it.
AB 1394 is yet another “protect the children on the internet” bill that again rushes to broadly attack the internet and create all sorts of problems, without understanding any of the issues or how the bill will make things way worse, not better. You can think of AB 1394 as California’s special version of FOSTA: it says that if a website “knowingly facilitates, aids, or abets” child sexual abuse material (CSAM), there’s a private right of action allowing people to sue the company for statutory damages of $1 million to $4 million.
There are all sorts of issues with this, the first being that anyone who knows anything about how the internet works will tell you that any website that allows user-generated content will, at some point, be used for CSAM. It’s a constant fight. Of course, federal law already has provisions for how to deal with this, requiring companies to report such content to NCMEC as they become aware of it, and to follow some pretty specific procedures that allow law enforcement to do its thing (which it rarely seems to actually do) to try to help victims.
But, once you put massive civil liability on top of that for “knowingly” doing something, YOU ARE TELLING COMPANIES TO STOP TRYING TO HELP. Because the way you avoid “knowing” something is to never, ever look, and never do anything to try to learn about CSAM on the platform. This bill literally incentivizes companies to do way less in the fight against CSAM, because the more they do, the more they risk “knowing” that CSAM is on their platform, and the greater the liability.
And the law is even worse than that. It says that any “system, design, feature, or affordance” on a website that is a “substantial factor” in enabling sexual exploitation of minors counts as “facilitating, aiding, or abetting.” And that’s true even if the feature has perfectly legitimate purposes. I’m reminded of various cases where websites were accused of “aiding and abetting” or inducing law-breaking activity because they had a “search” engine on their site. That was part of what got early file sharing services killed. Of course, in the case of Megaupload, the company was attacked for not having a search engine, with prosecutors claiming that was how it tried to “hide” its allegedly illegal activity. In other words, both having a feature and not having that same feature can be used by creative lawyers to attack basically any company. This is exceptionally broad and will lead to all sorts of ridiculous lawsuits from lawyers eager to get that multi-million-dollar statutory payout.
We’re already seeing that happen with FOSTA, and AB 1394 is way broader than FOSTA.
And… the law gets worse. Rather than just mimicking the already problematic parts of FOSTA, AB 1394 also takes some of the very worst ideas from Congress’ EARN IT bill (which, thankfully, still isn’t law), which had the problem of potentially turning websites into state actors by demanding that they search for CSAM. As many legal experts explained with EARN IT, this is a massive problem, because once a platform becomes a state actor, the 4th Amendment applies, and now many of the searches a private platform could have done on its own would require a warrant.
What that means is that, in practice, a ton of CSAM evidence will not be usable in court, because it will have been obtained in violation of the 4th Amendment.
Once again, the bill would make the fight against CSAM significantly more difficult. Defenders of AB 1394 will claim that it doesn’t require proactive scanning for all CSAM, but that’s wrong. What it does require is scanning for and blocking all instances of any reported image. So once a site learns of an image, it has to find and block every copy of it, or violate the law. And doing so introduces that massive 4th Amendment problem, making it much more difficult to obtain usable evidence against those creating and distributing CSAM.
And that’s doubly stupid, because basically any of the sites that actually matter already use tools like PhotoDNA and others to find, block, and report known CSAM. When they do it by themselves, it’s not a 4th Amendment issue. As soon as the law requires it, it becomes one, and screws up the ability to make use of that evidence.
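To make that concrete, here’s a minimal sketch of the hash-list approach that tools like PhotoDNA embody. To be clear, this is purely illustrative: PhotoDNA itself is a proprietary perceptual-hashing system whose hash lists are shared through NCMEC, so the plain SHA-256 fingerprinting, the example hash value, and the function names below are all stand-ins, not the real API.

```python
import hashlib

# Hypothetical list of fingerprints for known, previously reported images.
# Real systems use perceptual hashes (e.g., PhotoDNA) so they can also match
# re-encoded or slightly altered copies; this value is made up for the sketch.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an uploaded image (SHA-256 as a stand-in)."""
    return hashlib.sha256(image_bytes).hexdigest()

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known, reported image.

    A match means the platform can block the upload and file a report.
    A novel image produces no match, which is why hash lists can only
    catch material that has already been identified and listed.
    """
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES
```

The point of the sketch is that this kind of matching only works against images already on a list, and that platforms run it today voluntarily, which is exactly why mandating it by law creates the 4th Amendment problem described above.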
There’s more to it, but this is yet another terrible bill, put together by people who think they’re saving the children, when they don’t understand how anything actually works, and how their bill interacts with the real world.
And, of course, Common Sense Media supports it. Not only does it support it, but it’s gone on the offensive, trying to drum up a nonsense story about how big tech supports “pedophilia, bestiality, trafficking, and child sex abuse.” Really. Here’s Common Sense Media founder and CEO Jim Steyer, claiming that any effort to block or even fix this problematic bill means you support those things:
The press release Common Sense put out is ridiculously misleading, to the point of being just out and out disinformation. Here’s how it starts:
The Wall Street Journal recently reported that Instagram, owned by Meta (a member of TechNet), is actively connecting a vast online network of pedophiles and human traffickers to disturbing imagery and videos of child sex abuse. TechNet is an accomplice in Meta’s attempt to gut AB 1394, a bill that aims to hold social media companies accountable for commercial sexual exploitation of children.
It’s interesting that they point to the Wall Street Journal article to make those claims, rather than the underlying report it’s based on. Perhaps because that report is actually way more nuanced than Common Sense wants to admit. The report is not about known instances of CSAM, but rather about the much more difficult to detect “self-generated” CSAM (i.e., newly generated CSAM that is created and then posted by the child themselves, which is a horrific situation, and a massive challenge). The report actually highlights the difficulty of dealing with this issue on Instagram, and nowhere among its recommendations on how to deal with this is anything associated with AB 1394. Because AB 1394 would make the problem significantly worse, not better. It would make it more difficult for Instagram to find and remove this content, and it would make it far more difficult for law enforcement to use such material as evidence.
But, Common Sense connects the two anyway, and then sent this obnoxiously stupid letter to TechNet members:
You are a member of TechNet. And TechNet, on your behalf, is lobbying to stop a bill in Sacramento, AB 1394, that would specifically crack down on child sex trafficking, child pornography, and child sexual abuse, including bestiality, to engage sexual predators and pedophiles online. This is being done in your companies’ names.
Why do innovative and thoughtful leaders like yourselves want to attach your brand and reputation with a company like Meta in opposing a bill to stop child sex trafficking? We are calling on you to renounce TechNet’s opposition and amendments that aim to gut this incredibly important bill for children and families authored by Asm Buffy Wicks. We are calling on you to either leave TechNet or publicly denounce TechNet’s duplicitous behavior and partnership with Meta.
Except, again, AB 1394 would not, in fact, crack down on those things. As explained above, it would make it way more difficult to do so, and create some huge legal problems for both the platforms and law enforcement seeking to actually address the problems.
Once again, Common Sense Media has no common sense.
TechNet responded with a pretty masterful letter of its own, highlighting not just the problems the bill has in actually achieving its stated goals, but noting that TechNet has actually been working to fix those problems, while Common Sense Media and Jim Steyer just post stupid shit online.
Our commitment, and our member companies’ commitment, to fighting back against sexual predators is crystal clear: the internet, and any platforms on it, should not be a safe haven for these activities and criminals should be prosecuted to the fullest extent of the law.
Since TechNet has spent the last several months working in good faith in pursuit of these objectives by negotiating amendments to this bill with the author and sponsors, we will assume your press release, letter, and social media post were the product of ignorance rather than malice.
Allow us to help you catch up on the latest with respect to your co-sponsored legislation:
• On June 26, TechNet approached Assemblymember Wicks with good faith amendments that would 1) make AB 1394 workable from both a legal and policy perspective and 2) result in the removal of more child pornography. We asked to sit down and negotiate, expressed a clear goal of negotiating industry-wide neutrality, and proposed a series of meetings to work in that direction.
• Since that time, we’ve had more than a dozen discussions and meetings with the author, sponsors, and other key legislative personnel, and those meetings have resulted in substantial progress. We know your organization didn’t participate in those negotiations, so perhaps you’re unaware of the details.
• During that process we have offered amendments that, if accepted, would result in the strongest piece of legislation in the country related to the removal of child pornography from the internet. We’ve made numerous concessions and focused our efforts on providing sound policy alternatives that will result in more child pornography being removed from the internet. For example, despite our strong opposition to increased civil liability, our amendments do not aim to strike the two private rights of action against platforms that fail to comply with the bill.
• Importantly, our amendments attempt to protect AB 1394 from likely First and Fourth Amendment challenges that could invalidate the bill or help perpetrators keep evidence out of court to avoid prosecution for their abhorrent crimes. The last thing TechNet or our members want is for a criminal defendant to be able to overturn their conviction based on evidence collected as a result of this bill, and we hope Common Sense would agree.
• As of last week, we were a few minor details away from an agreement that would remove our opposition. Unfortunately, organizations like yours have decided to upend major points of agreement and are knowingly pushing away from collaboration and toward litigation.
So Common Sense wants a bill that would create massive 1st and 4th Amendment problems and push companies to look the other way rather than risk “knowing” about CSAM on their platforms… and when TechNet tries to fix those issues, Common Sense and its CEO go public attacking them as supporting CSAM.
Once again, Common Sense Media has no common sense.
Filed Under: 1st amendment, 4th amendment, ab 1394, california, csam, facilitation, jim steyer, knowingly, knowledge
Companies: common sense media