As Expected, Twitter’s New Trust & Safety Rules Are ‘Elon’s Whims’

from the who-could-have-predicted-this? dept

Have you noticed that everything Elon Musk insisted was “bad” about the old Twitter (often incorrectly) is something… he’s now doing himself, but in even more ridiculous ways? He insisted that Twitter was run by people who were promoting ideological political views. Yet… it was Elon Musk (not old Twitter management) who publicly insisted people should vote for one party in the midterm elections. He insisted that Twitter unfairly blocked accounts based on made-up rationales. Yet it was Elon Musk who started making up nonsense rules to ban people who annoyed him. He insisted that “shadowbanning” was bad, but his stated solution for content moderation policies… is the exact same thing he claims (falsely) is “shadowbanning.”

He claimed that Twitter wasn’t transparent or open enough. But the decisions he’s making are made with zero transparency, and they close off Twitter. For example, the decision to cut off third-party apps, some of which really helped build Twitter into what it became, was made with no notice and no explanation. Instead, nearly a week later, the company claimed it was “enforcing its long-standing API rules,” but then quietly (days later) inserted a new rule banning third-party apps.

But the biggest issue of all was that Musk (and many others!) long seemed to believe that moderation decisions at Twitter were driven by Jack Dorsey’s whims. This was always wrong. The company had detailed policies in place for how to handle trust & safety issues. They weren’t always good policies, and they often needed to be adjusted, but there were policies.

However, now, the decisions are driven purely by Elon’s whims.

Bloomberg has a thorough and eye-opening look at what’s been happening with trust and safety at Twitter these days. And, basically, it’s whatever Elon says, or whatever his hand-picked trust & safety boss, Ella Irwin, says in trying to make Elon happy:

But now, internal documentation shows a decision-making process amounting to little more than unilateral directives issued by Twitter’s new owner. In late November, an account belonging to the leftist activist Chad Loder was banned from the platform. In Twitter’s internal system, a note read, “Suspension: direct request from Elon Musk,” according to a screenshot viewed by Bloomberg. On Dec. 11, Jack Sweeney, the creator of a bot tracking Musk’s private plane, posted a screenshot showing Irwin had sent a Slack message directing employees to restrict visibility to Sweeney’s bot account, @elonjet. On Dec. 15, when Twitter suspended prominent journalists covering Twitter and Musk, the action was accompanied by an internal note: “direction of Ella.”

Twitter used to have a group called the Global Escalations Team that could be a check on power at the top of the company, overruling executives based on existing policy. Employees say that group has folded, and Irwin and Musk can no longer be challenged through a formal process. In her emailed response, Irwin said that was “not accurate at all,” declining to elaborate.

In other words, rather than fix a system of arbitrary and capricious trust & safety content moderation decisions, Musk has created just such a system.

“It’s like Musk is taking all of the content moderation best practice norms the trust and safety community has built up over the past decade and is trying to set them on fire,” said Evelyn Douek, an assistant professor at Stanford Law School. “The entire trend has been towards giving users more transparency, predictability and due process. What Musk is doing is like the antithesis of this.”

There are also some crazy details in the article, including that even before Musk took over, Irwin tried to kill a program that sought to deal with troll spam pushing Chinese Communist Party propaganda:

Irwin and Roth also directly butted heads in the months before he left the company, according to people familiar with the matter. As part of the review of unnecessary projects, she ordered a pause of work Roth oversaw that scanned the social network for spammy actors or people who wished to inject disinformation into the platform, such as those who spread falsehoods favorable to the Chinese Communist Party, according to four former employees. Roth, who was a lateral peer of Irwin’s, bristled at what he saw as overreach by Irwin into crucial processes executed by his team, the people said. Roth overruled her, saying it was essential work, they said.

I should also note that I appreciate how the two excellent reporters on the story, Davey Alba and Kurt Wagner, handle the issue of the Twitter Files: they note what Musk and the people working on the files purport them to be, followed by what they really show:

On Dec. 8, the writer Bari Weiss posted a Twitter thread that purported to show that company employees had covertly blacklisted accounts and tweets; in reality, the documents she shared showed workers earnestly debating the spirit of their content moderation policies.

The article also debunks the false claims that have made the rounds among Musk supporters that the company didn’t take the issue of child abuse material seriously. The reporters spoke to NCMEC, which would know, since they’re the organization that deals with such reports:

The National Center for Missing and Exploited Children, a federally designated clearinghouse for online child sexual abuse imagery that works with law enforcement agencies, also refuted the idea that Twitter had not taken action on child exploitative content before Musk’s takeover. “It’s been disheartening to see that rhetoric because we had relationships with people that really, truly cared about the issues,” said Gavin Portnoy, a spokesman.

There’s a lot more in the article, and it’s worth reading if you want to know what a mess Twitter’s trust & safety practices have become. But, again, the biggest thing that stands out to me is how much of what Elon is doing (badly) is a worse version of what he said was wrong with Twitter in the first place.

Filed Under: arbitrary, content moderation, csam, ella irwin, elon musk, trust & safety, whims
Companies: twitter