Techdirt Podcast Episode 345: The Supreme Court Takes On 230
from the big-decisions dept
After all these years, the Supreme Court is finally weighing in on Section 230 in the Gonzalez and Taamneh cases, and the outcome could have a very significant impact. Our organization, the Copia Institute, filed an amicus brief in the case, as did many other parties. This week, we’re joined by Jess Miers from the Chamber of Progress and lawyer Cathy Gellis (who wrote our amicus brief), both of whom attended the Gonzalez hearing in person, to discuss the status of both cases and what they could mean for the future of the internet.
Follow the Techdirt Podcast on Soundcloud, subscribe via Apple Podcasts or Spotify, or grab the RSS feed. You can also keep up with all the latest episodes right here on Techdirt.
Filed Under: cathy gellis, content moderation, gonzalez, jess miers, section 230, supreme court, taamneh
Supreme Court Takes Section 230 Cases… Just Not The Ones We Were Expecting
from the well,-this-is-not-great dept
So, plenty of Supreme Court watchers and Section 230 experts all knew that this term was going to be a big one for Section 230… it’s just that we all expected the main issue to be around the NetChoice cases regarding Florida and Texas’s social media laws (those cases will likely still get to SCOTUS later in the term). There were also a few other possible Section 230 cases that I thought SCOTUS might take on, but still, the Court surprised me by agreeing to hear two slightly weird Section 230 cases. The cases are Gonzalez v. Google and Twitter v. Taamneh.
There are a bunch of similar cases, many of which were filed by two law firms together, 1-800-LAW-FIRM (really) and Excolo Law. Those two firms have been trying to claim that anyone injured by a terrorist group should be able to sue internet companies because those terrorist groups happened to use those social media sites. Technically, they’re arguing “material support for terrorism,” but the whole concept seems obviously ridiculous. It’s the equivalent of the family of a victim of ISIS suing Toyota after finding out that some ISIS members drove Toyotas.
Anyway, we’ve been writing about a bunch of these cases, including both of the cases at issue here (which were joined at the hip by the 9th Circuit). Most of them get tossed out pretty quickly, as the court recognizes just how disconnected the social media companies are from the underlying harm. But one of the reasons they seem to have filed so many such cases all around the country was to try to set up some kind of circuit split to interest the Supreme Court.
The first case (Gonzalez) dealt with ISIS terrorist attacks in Paris in 2015. The 9th Circuit rejected the claim that Google provided material support to terrorists because ISIS posted some videos to YouTube. To try to get around the obvious 230 issues, Gonzalez argued that YouTube recommended some of those videos via the algorithm, and those recommendations should not be covered by 230. The second case, Taamneh, was… weird. It had a somewhat similar fact pattern, but dealt with the family of someone who was killed in an ISIS attack at a nightclub in Istanbul in 2017.
The 9th Circuit tossed out the Gonzalez case, saying that 230 made the company immune even for recommended content (which is the correct outcome) but allowed the Taamneh case to move forward, for reasons that had nothing to do with Section 230. In Taamneh, the district court initially dismissed the case entirely without even getting to the Section 230 issue by noting that Taamneh didn’t even file a plausible aiding-and-abetting claim. The 9th Circuit disagreed, said that there was enough in the complaint to plead aiding-and-abetting, and sent it back to the district court (which could then, in all likelihood, dismiss under Section 230). Oddly (and unfortunately) some of the judges in that ruling issued concurrences which meandered aimlessly, talking about how Section 230 had gone too far and needed to be trimmed back.
Gonzalez appealed the issue regarding 230 and algorithmic promotion of content, while Twitter appealed the aiding and abetting ruling (noting that every other court to try similar cases found no aiding and abetting).
Either way, the Supreme Court is taking up both cases and… it might get messy. Technically, the question the Supreme Court has been asked to answer in the Gonzalez case is:
Whether Section 230(c)(1) of the Communications Decency Act immunizes interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limits the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information.
Basically: can we wipe out Section 230’s key liability protections for any content recommended? This would be problematic. The whole point of Section 230 is to put the liability on the proper party: the one actually speaking. Making sites liable for recommendations creates all of the same problems that making them liable for hosting would — specifically, requiring them to take on liability for content they couldn’t possibly thoroughly vet before recommending it. A ruling in favor of Gonzalez would create huge problems for anyone offering search on any website, because a “bad” content recommendation could lead to liability, not for the actual content provider, but for the search engine.
That can’t be the law, because that would make search next to impossible.
For what it’s worth, there were some other dangerously odd parts of the 9th Circuit’s Gonzalez rulings regarding Section 230 that are ripe for problematic future interpretation, but those parts appear not to have been included in the cert petition.
In Taamneh, the question is focused on the aiding and abetting question, but ties into Section 230, because it asks if you can hold a website liable for aiding and abetting if it tries to remove terrorist content but a plaintiff argues it could have been more aggressive in weeding out such content. There’s also a second question of whether or not you can hold a website liable for an “act of international terrorism” when the actual act of terrorism had nothing whatsoever to do with the website, and was conducted off of the website entirely.
(1) Whether a defendant that provides generic, widely available services to all its numerous users and “regularly” works to detect and prevent terrorists from using those services “knowingly” provided substantial assistance under 18 U.S.C. § 2333 merely because it allegedly could have taken more “meaningful” or “aggressive” action to prevent such use; and (2) whether a defendant whose generic, widely available services were not used in connection with the specific “act of international terrorism” that injured the plaintiff may be liable for aiding and abetting under Section 2333.
These cases should worry everyone, especially if you like things like searching online. My biggest fear, honestly, is that this Supreme Court (as it’s been known to do) tries to split the baby (which, let us remember, kills the baby) and says that Section 230 doesn’t apply to recommended content, but that the websites still win because the things on the website are so far disconnected from the actual terrorist acts.
That really feels like the kind of solution that the Roberts court might like, thinking that it’s super clever when really it’s just dangerously confused. It would open up a huge Pandora’s box of problems, leading to all sorts of lawsuits regarding any kind of recommended content, including search, recommendation algorithms, your social media feeds, and more.
A good ruling (if such a thing is possible) would be a clear statement that of course Section 230 protects algorithmically recommended content, because Section 230 is about properly putting liability on the creator of the content and not the intermediary. But we know that Justices Thomas and Alito are just itching to destroy 230, so we’re already down two Justices to start.
Of course, given that this court is also likely to take up the NetChoice cases later this term, it is entirely possible that next year the Supreme Court may rule that (1) websites are liable for failing to remove certain content (in these two cases) and (2) websites can be forced to carry all content.
It’ll be a blast figuring out how to make all that work. Though, some of us will probably have to do that figuring out off the internet, since it’s not clear how the internet will actually work at that point.
Filed Under: aiding and abetting, algorithms, gonzalez, isis, recommendations, section 230, supreme court, taamneh, terrorism, terrorism act
Companies: google, twitter