Facebook Board Overrules Company on Most Cases in First Test
Deciding its first cases, Facebook Inc’s oversight board ruled on Thursday that the social media company was wrong to remove four of five pieces of content the board reviewed, including posts Facebook took down for violating rules on hate speech and harmful COVID-19 misinformation.
The rulings are a crucial test of the independent body created by Facebook in response to criticism of how it treats problematic content. The board also called for Facebook to be clearer about its rules on what is allowed on its platforms.
The board was thrust into the spotlight last week after the company asked it to rule on the recent suspension of former U.S. President Donald Trump. It said on Thursday it would soon open that case up for public comment.
Facebook blocked Trump’s access to his Facebook and Instagram accounts over concerns of further violent unrest following the Jan. 6 storming of the U.S. Capitol by the former president’s supporters.
Facebook’s oversight board started hearing cases in October and announced the first cases it would review in December. Here is the full list of the board’s rulings:
Decisions Overturned:
- A post with photos of a deceased child that included commentary on China’s treatment of Uighur Muslims.
- An alleged quote from Nazi propaganda minister Joseph Goebbels that Facebook removed for violating its policy on “dangerous individuals and organizations.”
- A post in a group claiming certain drugs could cure COVID-19, which criticized the French government’s response to the pandemic. This case was submitted by Facebook, rather than a user.
- Instagram photos showing female nipples, posted by a user in Brazil to raise awareness of breast cancer symptoms. Facebook had already said this removal was an error and restored the post.
Decision Upheld:
- A post that purported to show historical photos of churches in Baku, Azerbaijan, with a caption that Facebook said indicated “disdain” for Azerbaijani people and support for Armenia.
Facebook now has seven days to restore the pieces of content that the board ruled should not have been taken down. The board said it would shortly announce one more case from its first batch.
The board also issued nine nonbinding policy recommendations – for example, that Facebook should tell users the specific rule they have violated and better define its rules on issues like dangerous groups and health misinformation. Facebook does not have to act on these, but it does have to respond publicly.
Facebook has long faced criticism for high-profile content moderation issues, ranging from temporarily removing a famous Vietnam-era war photo of a naked girl fleeing a napalm attack to failings in policing hate speech and misinformation.
The Board
The board will rule on a limited number of controversial decisions. It said on Thursday that 150,000 cases had been appealed to the board since it started accepting cases in October.
The board has 20 members, including former Danish Prime Minister Helle Thorning-Schmidt and Nobel Peace Prize laureate Tawakkol Karman.
The panel hears cases from users who have exhausted the company’s appeals process, but only over content removed from Facebook’s platforms, not content that has been left up. The board’s limited remit has been the subject of criticism. Facebook itself can ask the board to review a wider range of content problems.
Before the rulings were announced, a group of Facebook critics dubbed The Real Oversight Board said the rulings were “a PR effort that obfuscates the urgent issues that Facebook continually fails to address – the continued proliferation of hate speech and disinformation on their platforms.”
Facebook has pledged $130 million to fund the board for at least six years.
(Reporting by Elizabeth Culliford; Editing by Kenneth Li, Cynthia Osterman, and Steve Orlofsky)