Does Taking Down Content Lead Ignorant People To Believe It's More Likely To Be True?

from the well-that's-a-tough-question dept

Harper's has a giant and fascinating article by Barrett Swanson entitled The Anxiety of Influencers that has received some attention online. Most of the reactions are the typical tut-tutting about the existence of TikTok/Instagram influencers, whose entire (quite short) careers as “influencers” are based on their ability to get famous on social media for influencing. I do understand why people — especially older folks (a category I now inhabit myself) — look down on these stories, shake their heads, and wonder “what has happened to the children these days?” However, I’m more in the camp of recognizing that this kind of thing happens in every generation, and I don’t begrudge kids these days for chasing a dream, even if it feels like a silly one to someone not of that generation. There will always be young people chasing dreams, and along with them, old people complaining about the kids these days. I don’t find that approach particularly useful, so I’ll just say that the article is an interesting window into some of the “collab houses” that have sprung up all over (though mostly in LA), full of kids trying to become famous as influencers.

The reason this is here on Techdirt is one tiny bit of the article that touches on content moderation. At one point in the article, Swanson — who deftly alternates between chronicling “the kids these days,” envying some of their fame and attention, and recognizing just how preposterous all of this is — is talking with Chase Zwernemann, who (perhaps somewhat incredibly) is one of the “adults” in the collab house space at a geezerly 21 years old, enabling him to serve as “VP of talent management” for what he and his colleagues pitch as an academy for producing influencers. And Chase appears to have some interesting views about the state of the world, and about what he learns online.

Chase, the media liaison and self-described “influencing professor,” agrees. Later that day, he will tell me that “we’ve been kind of lucky to have these outlets across the last few months because we’ve been more exposed to what’s really going on.” For instance, just a few weeks ago, he was at home scrolling through his phone as a ritual of pre-sleep entertainment, at which point he stumbled upon “some kind of documentary” about the apparently rampant levels of Satanism in the U.S. entertainment industry. The documentary offered a detailed exegesis of demonic iconography, which supposedly many directors embed in their TV shows and movies. “It freaked me out, one hundred percent,” Chase says, “because I’ve seen those types of things—those signs and symbols—in these entertainment people’s offices, and so then to see this documentary and to start putting the pieces together, I mean, it’s nuts, man.”

At this point, I nonchalantly inquire as to whether Chase could maybe brandish his smartphone and pull up the video in question, and I’m soon made to view something called “Out of Shadows,” which has been posted on YouTube by an account called—I shit you not—Thinqing QAnon. Later, when I ask Chase whether he’s ever heard about the QAnon conspiracy, he says no, but explains that the video must be legit because “it’s gotten deleted multiple times off the internet, which is insane.” Epistemologically, this is where we are as a country: when content gets expurgated because of blatant misinformation, it is taken as a sure sign of that source’s truthfulness.

And… frankly… I’m not quite sure how to respond to that. Sure, there’s an element of the Streisand Effect in there, which I understand pretty well. But this veers slightly away from the Streisand Effect: it's the assumption that every takedown via content moderation must have happened because of the “hidden truths” the content reveals.

This certainly gets to the heart of some of the cultish conspiracy theory nonsense going around these days. In this view, nothing can be proven false, because merely attempting to disprove it somehow validates it. We’ve seen this before with other conspiracy theories, but it makes me wonder if the scale is different this time.

And that then opens up the question of what, if anything, should be done in such a situation. Leaving up blatantly false disinformation that is sucking people in with nonsense and lies is obviously problematic. But so is recognizing that removing the disinformation may lead people to believe in it more strongly. How do you square those two things and come up with a plan to respond? Part of it, obviously, is that different people react to things in different ways. Clearly, young Chase’s reaction to finding out this content keeps getting deleted is not how everyone (or even most people) will respond. But it’s unclear how many others would fall into that camp. Or what to do about the Chases of this world, who take information that should be seen as evidence that they’re mainlining disinformation and interpret it instead as evidence that the misinformation is true.

To some extent, this brings me back to a point that I’ve been making for years concerning questions of content moderation: we can’t expect “someone else” (government, big companies, journalists, fact checkers, etc.) to solve every problem. That’s just not how it works. At some point, there has to be some personal responsibility and some level of media literacy among the people who consume all this stuff. And clearly we’ve got a long way to go on that front.

Filed Under: content moderation, disinformation, influencers, truth