Last Thursday, the Oversight Board, created to review Facebook's content decisions, upheld the company's ban of former President Donald Trump for inflammatory comments made during the Capitol Hill riots.
The board's decision acknowledges the potential for harm that comments made on social media can pose, particularly when those remarks can be interpreted as supportive of violence or disorder.
The panel of experts that upheld the ban draws its members from journalism, law, academia, and human rights. And while it stood behind Facebook's judgment to restrict Trump, it also ruled that the company had violated its own policies by making the ban indefinite. This is an important distinction.
Most people agree on the need for online moderation and are in favor of social media platforms acting quickly to remove inflammatory posts with the potential to cause harm. That said, the capacity for Facebook to act in an ad hoc manner, not transparently defined in its policies, could create a dangerous precedent.
If Facebook could ban Trump indefinitely – with no coherent basis in its policies – might Facebook decide to cull other accounts in the future? And could it potentially use what the board described as an 'indeterminate and standardless' decision-making capacity to exert political influence?
Facebook Oversight Board
Questions surrounding the need for online moderation are complex because they inevitably touch on freedom of speech and citizens' protected First Amendment rights.
By setting up an external oversight board to help it make significant judgments, Facebook had hoped to pass some of its responsibilities to an autonomous panel of trustworthy intellectuals.
This self-contained entity, Facebook hoped, would allow it to justify and add authority to its decisions, ultimately absolving Zuckerberg of accountability and allowing him to claim that any disputed decisions weren't Facebook's fault.
Unfortunately for Zuck, onlookers have already sown doubts about the independence of a board whose 20 members were cherry-picked by Facebook. And, despite claims to the contrary from board members like Alan Rusbridger, there was always going to be some rhetoric regarding the potential for trial by private corporation.
Back to the drawing board
Members of the Oversight Board appear to have been cognizant of this potential grievance, and were astute enough to put the ball back in Facebook's court instead of deciding on its behalf:
"In applying a vague, standardless penalty and then referring this case to the board to resolve, Facebook seeks to avoid its responsibilities. The board declines Facebook's request and insists that Facebook apply and justify a defined penalty."
This is interesting because it is almost certainly not the outcome Facebook was hoping for. Admittedly, some might consider the board's decision a cop-out, or a failure to force Facebook's hand in one way or the other.
Instead, the outcome gives Facebook the opportunity to self-determine, arguably putting it back in the same position it was in before going to the board for advice, albeit with clearer expectations and a time limit.
What will Facebook do next?
By ruling that Facebook was justified in banning Trump – but wrong to make that ban indefinite – the board has left Facebook with two possible routes forward. Trump's indefinite ban violated Facebook's own policies, which means the company must now decide on, justify, and prescribe a penalty that can apply equally to all users in the future.
As a result, Facebook must now either justify a temporary ban commensurate with the gravity of the violation, in line with its existing policies, or update its policies to frame a permanent ban as justified by the potential for future harm.
If it chooses the latter, it will again open the door to criticisms of partisanship and fears over the restriction of free speech – a tension Facebook may once more ask the board to help resolve.
For now, we must wait for Facebook to make its choice. In the meantime, we can at least be grateful that Facebook is being forced to clarify its guidelines and policies.
This is preferable to a precedent in which Facebook can improvise decisions on the fly, potentially doing not what is best for society but what is best for its bottom line.
That said, Facebook is still being allowed to make the decision itself, and will no doubt choose to do what best aligns with its business model.