New corporate ‘Treaty Body’ gears up to consider Facebook’s decision to bar Donald Trump

by Marc Limon, Executive Director of the Universal Rights Group

As reported in the New York Times, its members include two people who were reportedly on presidential shortlists for the US Supreme Court, a Yemeni Nobel Peace Prize laureate, a British Pulitzer Prize winner, a former UN Special Rapporteur, Colombia’s leading human rights lawyer, and a former prime minister of Denmark. Welcome to the Facebook Oversight Board, operational since October 2020 (Mark Zuckerberg originally floated the idea back in 2018), and responsible for independently overseeing Facebook’s content moderation decisions.

Last week this new type of oversight body – think private sector treaty body or supreme court – delivered the first of its decisions following a review of appeals against blocked or removed content. Those decisions (on five user appeals and one referral from Facebook – one was subsequently withdrawn), briefly considered below, presage a far more momentous decision due in the next three months – namely, should Donald Trump be permitted to return to Facebook and reconnect with his millions of followers?

That decision will have major consequences not only for the right to freedom of expression and the permissible limits thereto, for managing the growing threat of ‘hate speech’ and disinformation (‘fake news’) in the digital age, and for American politics, but also for the way in which social media is regulated and human rights are policed.

The Board reaches its decisions guided by Facebook’s own ‘Community Standards’ as well as international human rights law – especially the International Covenant on Civil and Political Rights (ICCPR), and related reports by UN experts (e.g., the Special Rapporteur on freedom of expression, and the members of the Human Rights Committee). The board is structurally independent, and Mr. Zuckerberg has promised its decisions will be binding.

The sheer power and reach of Facebook (2.7 billion users), together with the potential of social media for both good (e.g., promoting freedom of expression and freedom of information) and ill (e.g., incitement to hatred and violence, the spread of disinformation and conspiracy theories), the pioneering nature of Facebook’s approach, and the large amounts of money being paid (via a trust) to the Board’s members and staff (the New York Times reports that each board member is paid a six-figure sum annually for roughly 15 hours of work a week), mean that the members and the secretariat (which includes two former senior employees of Article 19, a free speech NGO) have an enormous responsibility to get their decisions right.

The first five cases

Overall, the Board chose to overturn Facebook’s decision to remove content in four out of the five cases it considered. As a result, Facebook must restore those four posts.

Hate speech against Muslims? (Case 2020-002-FB-UA)

On October 29, 2020, a user in Myanmar posted in a Facebook group in Burmese. The post included two widely shared photographs of a Syrian toddler of Kurdish ethnicity who drowned attempting to reach Europe in September 2015.

The accompanying text stated that there is something psychologically wrong with Muslims. It questioned the lack of response by Muslims to the treatment of Uyghur Muslims in China, compared to killings in response to cartoon depictions of the Prophet Muhammad in France. The post concluded that recent events in France reduced the user’s sympathies for the depicted child, and seemed to imply the child may have grown up to be an extremist.

Facebook removed this content under its Hate Speech Community Standard.

The Oversight Board overturned Facebook’s decision to remove the post. It found that, while the post might be considered offensive, it did not reach the threshold to be considered as ‘hate speech.’

In this case, the Board’s reasoning appears sound from a human rights perspective. The user does not appear to be attempting to justify terrorist acts or the death of the young child. Rather, he/she is attempting (no matter how offensively) to draw attention to the alleged hypocrisy of Muslims’, or Muslim States’, outrage at the death of a Muslim refugee and their perceived silence on (or even condoning of) the treatment of Uyghurs in China and/or the killings in France.

Hate speech against Azerbaijanis? (Case Decision 2020-003-FB-UA)

In November 2020, a user posted content which included historical photos described as showing churches in Baku, Azerbaijan. The accompanying text (in Russian) claimed that Armenians built Baku and that this heritage has been destroyed. The user used the term ‘тазики’ (‘taziks’) to describe Azerbaijanis, who the user claimed are nomads and have no history compared to Armenians.

The user included hashtags in the post calling for an end to Azerbaijani aggression and vandalism. Another hashtag called for the recognition of Artsakh, the Armenian name for the Nagorno-Karabakh region, which is at the centre of a conflict between Armenia and Azerbaijan. The post received more than 45,000 views and was posted during the recent armed conflict between the two countries.

The Oversight Board upheld Facebook’s decision to remove the post for violating its Community Standard on Hate Speech.

According to the Board, the term ‘taziks,’ while it can be translated literally from Russian as ‘wash bowl,’ can also be understood as a wordplay on the Russian word ‘азики’ (‘aziks’), a derogatory term for Azerbaijanis which features on Facebook’s internal list of slur terms. Independent linguistic analysis commissioned on behalf of the Board confirmed Facebook’s understanding of ‘тазики’ as a dehumanising slur attacking national origin.

Comparison of Joseph Goebbels and Donald Trump (Case Decision 2020-005-FB-UA)

In October 2020, a user posted a quote which was incorrectly attributed to Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany. The quote, in English, claimed that, rather than appealing to intellectuals, arguments should appeal to emotions and instincts. It stated that truth does not matter and is subordinate to tactics and psychology. There were no pictures of Joseph Goebbels or Nazi symbols in the post. In a statement to the Board, the user said that their intent was to draw a comparison between the sentiment in the quote and the Presidency of Donald Trump.

Facebook removed the post because the user did not make clear that they shared the quote to condemn Joseph Goebbels, to counter extremism or hate speech, or for academic or news purposes.

The Oversight Board overturned Facebook’s decision to remove a post which the company claims violated its Community Standard on Dangerous Individuals and Organisations.

Reviewing the case, the Board found that the quote did not support the Nazi party’s ideology or the regime’s acts of hate and violence. Comments on the post from the user’s friends supported the user’s claim that they sought to compare the Presidency of Donald Trump to the Nazi regime, as both, in the view of the user, aimed to appeal to (base) emotions and instincts, rather than to ‘intellectual’ factors such as truth and facts. On this basis, the Board – rightly – found that the removal of the post went against the user’s right to freedom of opinion.

COVID-19 ‘fake news’ in France? (Case Decision 2020-006-FB-FBR)

In October 2020, a user posted a video and accompanying text in French in a public Facebook group related to COVID-19. The post alleged a scandal at the Agence Nationale de Sécurité du Médicament (the French agency responsible for regulating health products), which refused to authorise hydroxychloroquine combined with azithromycin for use against COVID-19, but authorised and promoted remdesivir. The user criticised the lack of a health strategy in France and stated that ‘[Didier] Raoult’s cure’ is being used elsewhere to save lives. The user’s post also questioned what society had to lose by allowing doctors to prescribe a ‘harmless drug’ during an emergency.

In its referral to the Board, Facebook cited this case as an example of the challenges of addressing the risk of offline harms that may be caused by misinformation relating to the COVID-19 pandemic.

The Oversight Board overturned Facebook’s decision to remove the post.

Facebook removed the content for violating its misinformation and imminent harm rule, which is part of its Violence and Incitement Community Standard, finding the post contributed to the risk of imminent physical harm during a global pandemic. Facebook explained that it removed the post as it contained claims that a cure for COVID-19 exists. The company concluded that this could lead people to ignore health guidance or attempt to self-medicate.

The Board observed that, in the post, the user was opposing a governmental policy and wanted to change that policy. The combination of medicines that the post claims constitute a cure are not available without a prescription in France and the content does not encourage people to buy or take drugs without a prescription. Considering these and other contextual factors, the Board decided that Facebook had not demonstrated that the post reached the threshold of creating a risk of ‘imminent harm,’ as required by its own Community Standards.

The Board also found that Facebook’s decision did not comply with international human rights standards on limiting freedom of expression. Given that Facebook has a range of tools to deal with misinformation, such as providing users with additional context, the company failed to demonstrate why it did not choose a less intrusive option than removing the content.

On balance, it seems the Board again reached a sound decision in this case. It might have added that the exercise of freedom of speech and of opinion to criticise government policy is central to the functioning of democratic society.

Et tu Trumpe?

These early decisions, which broadly engender faith in Facebook’s new oversight process, are but the entrée ahead of a far more consequential main course: the Board’s upcoming review of Facebook’s decision to suspend former President Trump’s account indefinitely after the attack on the US Capitol on 6 January.

According to one of the Board’s Co-Chairs, Jamal Greene, a former Supreme Court clerk and Kamala Harris aide, and now Professor of Law at Columbia University: ‘This is the kind of case the oversight board is for.’

As the New York Times has noted, it is hard to imagine a more consequential case. The decisions by Twitter and Facebook to bar Trump immediately reshaped the American political landscape. ‘In the course of a few hours after the Capitol riots, they simply vaporised the most important figure in the history of social media.’

The board took up the case in late January, and will appoint a panel of five randomly selected board members, at least one of them American, to consider it and reach a decision. The full 20-person board will then review the decision, and could either reinstate Trump’s direct connection to millions of supporters, or sever it for good.

There are mixed views as to which way the Board is likely to swing. Scholars and human rights experts seem split between those worried about the implications of permanent bans for free speech (the CEO of Twitter, Jack Dorsey, and the Chancellor of Germany, Angela Merkel, share such concerns), and those worried about what the consequences would be for American democracy, should Facebook’s decision be overturned.

According to Noah Feldman, the Felix Frankfurter Professor of Law at Harvard Law School, who first brought the notion of a Facebook Supreme Court to the company, conservatives dismayed by the suspension might be surprised to find an ally in this new international institution. ‘They may come to realise that the Oversight Board is more responsive to freedom of expression concerns than any platform can be, given real world politics,’ he said.

Nick Clegg, Facebook’s vice president for global affairs, on the other hand, has said he is ‘very confident’ the Board would affirm the company’s decision to suspend Trump’s account, though less sure as to what recommendations it might make about allowing the former President to return to the platform in the future.

Whatever the final decision, it is clear it will have enormous consequences for human rights and democracy, not just in the US but around the world. This has led many, including UN Secretary-General Antonio Guterres, to question whether individual companies should have the power to regulate free speech in such a manner.

‘I do not think that we can live in a world where too much power is given to a reduced number of companies,’ he told reporters. ‘And I must say that I’m particularly worried with the power that they already have.’ Instead, Guterres appeared to call for intergovernmental institutions to take the lead in setting universal norms so that States can apply consistent rules through ‘regulatory frameworks’ that allow such decisions to be taken ‘in line with the law.’

