Facebook and Trump
Facebook's Oversight Board has finally issued its long-awaited ruling on the company's decision to "indefinitely suspend" former President Trump after his statements during and following the January 6 riot at the U.S. Capitol.
At the time, Facebook said it believed "the risks of allowing President Trump to continue to use our service during this period are simply too great." As a result, the company blocked Trump's access to Facebook and Instagram "for at least the next two weeks."
Then, two weeks later, it decided it would refer the matter to the Oversight Board it set up to review content moderation decisions on the platform. That Board, according to Facebook, was established "to make the final call on some of the most difficult content decisions Facebook makes."
Banning the President of the United States would certainly qualify as a "difficult content decision." Except that, as the Board points out, Facebook didn't actually make a difficult content decision in this case. It didn't really make a decision at all. Instead, it imposed an indefinite suspension and then asked someone else to make the call, a point the Board made even as it upheld the original suspension.
The Board has upheld Facebook's decision on January 7, 2021, to restrict then-President Donald Trump's access to posting content on his Facebook page and Instagram account... However, it was not appropriate for Facebook to impose the indeterminate and standardless penalty of indefinite suspension.
The ruling goes on to say that Facebook has six months to decide what an appropriate penalty is based on the severity of the circumstances. That could very well include a permanent ban, but if Facebook wants Trump off its platform, it has to be willing to stand up and say just that. That's actually an important point, and it's one that shouldn't be lost in all of the reporting that the Board "upheld" Facebook's ban.
In applying a vague, standardless penalty and then referring this case to the Board to resolve, Facebook seeks to avoid its responsibilities. The Board declines Facebook's request and insists that Facebook apply and justify a defined penalty.
See, I would argue the Board didn't really "uphold" the ban. You can't uphold something and also call it vague and standardless. Instead, the Board upheld the original decision to restrict Trump, but then issued a devastating criticism of Facebook's complete unwillingness to make a difficult content decision.
"Facebook seeks to avoid its responsibilities."
In fact, with just those six words, the Oversight Board highlighted everything that's wrong with Facebook.
Facebook absolutely should have the right to set rules and guidelines about how people share content on its platform. Obviously, things get complicated when the people sharing content are world leaders who incite their followers to insurrection. Still, Facebook should be able to impose penalties when people violate those rules.
The problem is, as the Board points out, Facebook didn't even follow its own rules--and didn't actually impose a penalty. It would be as if a judge found you guilty of doing something wrong, sent you to prison indefinitely, and then asked an appeals court to decide how long you should stay. That's just not how it's supposed to work.
It's definitely not how it's supposed to work when you're the world's largest social media platform, with influence over the lives of billions of people every day. In the words of Peter Parker's Uncle Ben, "With great power comes great responsibility."
The thing is, that idea didn't originate with a fictional comic book character. You'll find the same sentiment in writings from the French National Convention in 1793 ("They must consider that great responsibility follows inseparably from great power").
You'll even find it in the Bible, in the Gospel of Luke: "Much will be required of everyone who has been given much. And even more will be expected of the one who has been entrusted with more." (Luke 12:48).
My point is, the Board got this part exactly right. Whether or not Facebook should have banned Trump, it should have made a decision, justified its reasons, and stood by it. Instead, it passed the buck to a group it set up precisely so it could avoid the tough decisions.
Facebook's Trump Problem Just Got Much Worse
I stand by that take, but I think there was something more important that I mostly overlooked. Take another look at the passage that quote about responsibility comes from:
In applying a vague, standardless penalty and then referring this case to the Board to resolve, Facebook seeks to avoid its responsibilities. The Board declines Facebook's request and insists that Facebook apply and justify a defined penalty.
I missed a word in the second sentence of that quote: the part where the Board says that Facebook must "justify a defined penalty." That one word, "justify," is important. It isn't enough for Facebook to articulate whether Trump will be permanently banned; it must explain why that penalty is appropriate.
It's important because once Facebook makes a decision about Trump, that decision is likely to be appealed back to the Board. I'd argue that inherent in the Board's command is an expectation that it will be reviewing the final penalty.
In insisting that Facebook produce a justification for a "defined penalty," the Board is putting it on notice that it will be carefully considering whether or not the punishment fits the--in this case--literal crime.
This might be the worst-case scenario for Facebook.
Despite its statement that "we believe our decision was necessary and right, and we're pleased the board has recognized that the unprecedented circumstances justified the exceptional measure we took," this is actually bad news for Facebook.
Facebook was counting on the Oversight Board to take this decision out of its hands. At the very least, it was counting on the idea that if the Board ruled it had to allow Trump to return, it would have no choice, and it could wash its hands of responsibility for whatever happened next.
Or, if the Board ruled Trump should stay banned, it would give cover to Facebook's decision. It would be able to look at critics and say "hey, this independent group says he has to go. We know you disagree but you'll have to take it up with them."
But, the Board did no such thing. It didn't give Facebook cover. It's entirely Facebook's problem now. In fact, it's likely a much bigger problem than before.
If Facebook thought its decision might tamp down efforts on both sides of the political spectrum to take a more aggressive approach to regulating how social media platforms moderate content, this isn't going to help. In fact, it will surely move the needle in the opposite direction.
You only had to turn on cable news or scroll through Twitter to see the outrage among Trump's defenders in Congress, along with calls to regulate Facebook, break it up, or eliminate Section 230.
Talk about reopening a can of worms Facebook wanted nothing to do with. Six months is a long time, and I'm sure Facebook doesn't want this to drag out. It wouldn't surprise me if Mark Zuckerberg is sitting somewhere thinking that there's no decision that doesn't make a lot of people very angry.
Then again, that's what comes with running the world's largest social media platform, which affects the daily lives of more than 2 billion people. If nothing else, it seems pretty clear that this is going to get worse long before it gets better.
Related Stories
Facebook Oversight Board's Trump Decision Is a Circus - The Atlantic — www.theatlantic.com This is new, and this is bizarre.
Opinion | Good Riddance to Trump on Facebook? - The New York Times — www.nytimes.com A Facebook-appointed panel avoided a clear decision about Trump’s heinous online behavior. It’s kind of perfect, actually.
Facebook has no reason to ever resolve the Trump ban - The Verge — www.theverge.com The Oversight Board ruled that former President Donald Trump can stay on the platform, but gave Facebook six months to justify the decision in a new policy. But will the ban ever be fully resolved?