The Oversight Board's Decision On Facebook's Trump Ban Is Just Not That Important
from the undue-ado dept
Today is Facebook Oversight Board Hysteria Day, because today is the day that the Facebook Oversight Board has rendered its decision about Facebook’s suspension of Donald Trump. And it has met the moment with an appropriately dull decision, dripping in pedantic reasonableness, that is largely consistent with our Copia Institute recommendation.
If you remember, we were hesitant about submitting a comment at all, and the reaction to the Board's decision bears out why. People keep reacting as though it is some big, monumental, important decision when, in fact, it isn't at all. In the grand scheme of things, it's still just a private company being advised by its private advisory board on how to run its business, nothing more. As it is, Trump himself is still on the Internet; it's not like Facebook actually had the power to silence him. We need to worry about situations where there actually is power to silence people, and undue concern about Facebook's moderation practices only distracts us from those threats. Or, worse, it leads people to try to create actual law that will end up having the effect of giving others the legal power to suppress expressive freedom.
So our pride here is necessarily muted, because ultimately this decision just isn’t that big a deal. Still, as a purely internal advisory decision, one intended to help the company act more consistently in the interests of its potential user base, it does seem to be a good one given how it hews to our key points.
First, we observed that then-President Trump's use of his Facebook account threatened real, imminent harm. We also emphasized, however, that it is generally better to try not to delete speech (or speakers). Sometimes it might need to be done, and in those cases it should be done "with reluctance and only limited, specific, identifiable, and objective criteria to justify the exception." There might not ultimately be a single correct decision, we wrote, for whether speech should be left up or taken down. "[I]n the end the best decision may have little to do with the actual choice that results but rather the process used to get there."
And this sort of reasoning is basically at the heart of the Board’s decision: Trump’s posts were serious enough to justify a sanction, including a suspension, but imposing the indefinite suspension appeared to be unacceptably arbitrary. Per the Board, Facebook needs to make these sorts of decisions consistently and transparently from here on out.
On January 6, Facebook's decision to impose restrictions on Mr. Trump's accounts was justified. The posts in question violated the rules of Facebook and Instagram that prohibit support or praise of violating events, including the riot that was then underway at the U.S. Capitol. Given the seriousness of the violations and the ongoing risk of violence, Facebook was justified in imposing account-level restrictions and extending those restrictions on January 7. However, it was not appropriate for Facebook to impose an indefinite suspension. Facebook did not follow a clear published procedure in this case. Facebook's normal account-level penalties for violations of its rules are to impose either a time-limited suspension or to permanently disable the user's account. The Board finds that it is not permissible for Facebook to keep a user off the platform for an undefined period, with no criteria for when or whether the account will be restored.
The Board has given Facebook six months to re-evaluate the suspension in accordance with clear rules.
If Facebook determines that Mr. Trump's accounts should be restored, Facebook should apply its rules to that decision, including any modifications made pursuant to the policy recommendations below. Also, if Facebook determines to return him to the platform, it must address any further violations promptly and in accordance with its established content policies.
As for what those rules should be, the Board also made a few recommendations. First, it noted that "political leader" versus "influential user" is not always a meaningful distinction. Indeed, we had noted that Trump's position cut both ways: as a political leader, there was public benefit to knowing what he had to say; on the other hand, that position also gave his posts greater ability to do harm. The Board, for its part, noted that context will matter: while the rules should ideally be the same for everyone, the impact of rule-breaking posts won't be, so it is fine for Facebook to take the specific probability of imminent harm into account when making its decisions.
The Board believes that it is not always useful to draw a firm distinction between political leaders and other influential users. It is important to recognize that other users with large audiences can also contribute to serious risks of harm. The same rules should apply to all users of the platform; but context matters when assessing issues of causality and the probability and imminence of harm. What is important is the degree of influence that a user has over other users.
More broadly, the Board cited general principles of human rights law, and specifically the Rabat Plan of Action, "to assess the capacity of speech to create a serious risk of inciting discrimination, violence, or other lawless action." As for how long suspensions should last, the Board said they should be long enough to "deter misconduct" and may, in appropriate cases, even "include account or page deletion." Facebook is therefore free to re-impose Trump's suspension as it re-evaluates it, if it feels that remains warranted. It just needs to do so in a more transparent way that would be scalable to other similar situations. As the Board summarized:
Facebook should publicly explain the rules that it uses when it imposes account-level sanctions against influential users. These rules should ensure that when Facebook imposes a time-limited suspension on the account of an influential user to reduce the risk of significant harm, it will assess whether the risk has receded before the suspension ends. If Facebook identifies that the user poses a serious risk of inciting imminent violence, discrimination or other lawless action at that time, another time-bound suspension should be imposed when such measures are necessary to protect public safety and proportionate to the risk. The Board noted that heads of state and other high officials of government can have a greater power to cause harm than other people. If a head of state or high government official has repeatedly posted messages that pose a risk of harm under international human rights norms, Facebook should suspend the account for a period sufficient to protect against imminent harm. Suspension periods should be long enough to deter misconduct and may, in appropriate cases, include account or page deletion.
As we suggested in our comment, the right policy choices for Facebook to make boil down to the ones that best make Facebook the community it wants to be. At its core, that’s what the Board’s decision is intended to help with: point out where it appears Facebook has fallen short of its own espoused ideals, and help it get back on track in the future.
Which is, overall, a good thing. It just isn’t, as so many critics keep complaining, everything. The Internet is far more than just Facebook, no matter what Trump or his friends think. And there are far more important things for those of us who care about preserving online expression to give our attention to than this.
Filed Under: appeals, donald trump, permanent suspension, policies, rules, suspension
Companies: facebook, oversight board