Facebook’s Oversight Board is accepting appeals from Facebook and Instagram users about other people’s content that’s been allowed to remain on the platforms.
The board, which was set up last October, can make binding rulings on Facebook's content decisions over whether or not to remove content – even overruling the platform and its executives in the process.
Since its launch, the $130 million (£105 million) board has allowed users to appeal to the board about their own content being removed.
But now Facebook is expanding the board's remit so people can also appeal over content posted by others that has been allowed to remain on either of the platforms, both of which are owned by Facebook.
Users can file appeals over posts, photos, videos, comments, statuses and shares that they think the company should have removed.
How to report content that’s been left up
If someone does not think that a piece of content should be on Facebook or Instagram, they first need to report the content to Facebook.
The Facebook Help Center (facebook.com/help) explains how to do this for various types of content.
If Facebook decides to keep the content up even after review, the reporting person will receive an Oversight Board Reference ID in their Support Inbox.
From there, they can appeal Facebook's decision to the Oversight Board.
The ‘independent’ board, which comprises 20 members and is known as Facebook’s ‘supreme court’, said the move is an important step towards delivering a more principled and transparent model of content moderation.
‘Enabling users to appeal content they want to see removed from Facebook is a significant expansion of the Oversight Board’s capabilities,’ said Oversight Board administration director Thomas Hughes.
‘The board was created to ensure that fewer decisions about highly significant content issues be taken by Facebook alone, and that better decisions can be delivered through an independent and transparent process that works to safeguard human rights and freedom of expression.
‘Today’s announcement is another step towards realising this.’
Just as before, content eligible for appeal to the board still includes posts, statuses, photos, videos, comments and shares.
In a blog post confirming the expanded capacity for the board, Facebook said it will be rolling out the functionality over the coming weeks.
‘Today’s announcement represents an expansion of the board’s initial scope,’ said Guy Rosen, VP of Integrity at Facebook, in the post.
‘Starting today [Tuesday], people who use Facebook and Instagram now have the ability to appeal other people’s content that has been left up to the Oversight Board.
Facebook announced its Oversight Board, which rules over what is allowed or removed from the platform, in October. The news of the so-called Facebook ‘supreme court’ arrived amid rising concerns about misinformation and manipulation around the US election
‘We expect everyone on Facebook and Instagram to be able to appeal content left up over the coming weeks.’
The Oversight Board makes binding rulings on whether posts or ads violate the company’s standards.
Since the board's introduction in October 2020, content removed from Facebook or Instagram has been eligible for final appeal to the board if a user disagreed with Facebook's decision, on re-review, to keep it down.
A panel of experts to rule on content was first proposed by Facebook founder Mark Zuckerberg in 2018, as a ‘supreme court’ that could overrule decisions made by the company, before it eventually materialised two years later.
The board says on its website: ‘As its community grew to more than two billion people, it became increasingly clear to the Facebook company that it shouldn’t be making so many decisions about speech and online safety on its own.
‘The Oversight Board was created to help Facebook answer some of the most difficult questions around freedom of expression online – what to take down, what to leave up and why.’
The board is made up of 20 panel members, although this number is expected to eventually grow to 40.
According to a report earlier this year in the New Yorker, Facebook’s Oversight Board members each earn six-figure salaries and only work about 15 hours a week.
The board has received criticism for its ‘left-leaning’ members, which include ex-Guardian editor Alan Rusbridger and Neil Kinnock’s daughter-in-law.
Critics accused Zuckerberg of ‘blowing’ his chance of setting up a ‘meaningful’ and ‘politically balanced’ oversight committee because so few of its members have conservative credentials.
In March, Rusbridger, a member of the Oversight Board, touted its independence and said it does not exist to ‘please’ the social network.
During an appearance before the House of Lords Communications and Digital Committee, Rusbridger said scuppering Facebook’s economic model is ‘not our problem’.
‘If you wanted to please Facebook, I think you’d have chosen a different group of people,’ he said.
‘In my experience of my colleagues so far, they’re quite bolshy, they don’t want to have anything to do with Facebook, they turf Facebook out of our meetings when we realise there are some people sitting there.
‘So we don’t feel we work for Facebook at all and so I don’t think there’s any obligation either to be nice to Facebook or be horrible to Facebook.’
Some news agencies, however, have referred to the board as ‘quasi-independent’, as Facebook provided the funding and helped choose the board members.
WHAT YOU NEED TO KNOW ABOUT FACEBOOK’S CONTENT OVERSIGHT BOARD
WHAT DOES THE OVERSIGHT BOARD REVIEW?
The board, which some have dubbed Facebook’s ‘supreme court’, rules on whether some individual pieces of content should be displayed on the site. It can also recommend changes to Facebook’s content policy, based on a case decision or at the company’s request.
The board reviews posts, videos, photos and comments that the company has decided to remove from Facebook or its photo-sharing site Instagram, as well as cases where content was left up.
This could be content involving issues such as nudity, violence or hate speech. Facebook has said the board’s remit will in future include ads, groups, pages, profiles and events, but has not given a time frame.
It will not deal with Instagram direct messages, Facebook’s messaging platforms WhatsApp, Messenger, its dating service or its Oculus virtual reality products.
Facebook expects the board will initially take on only ‘dozens’ of cases, a small percentage of the thousands it expects will eventually be brought to the board. In 2019, users appealed more than 10 million pieces of content that Facebook removed or took action on.
But Facebook’s head of global affairs, Nick Clegg, previously told Reuters he thought the cases chosen would have a wider relevance to patterns of content disputes.
HOW DOES THE BOARD WORK?
The board decides which cases it reviews, which can be referred either by a user who has exhausted Facebook’s normal appeals process or by Facebook itself for cases that might be ‘significant and difficult’.
Users who disagree with Facebook’s final decision on their content have 15 days to submit a case to the board through the board’s website.
Each case is reviewed by a panel of five members, at least one of whom comes from the geographic region where the case originated.
The panel can ask for subject matter experts to help make its decision, which then must be finalised by the whole board.
The board’s case decision – which is binding unless it could violate the law – must be made and implemented within 90 days, though Facebook can ask for a 30-day expedited review for exceptional cases, including those with ‘urgent real-world consequences’.
Users will be notified of the board's ruling on their case, and the board will publish the decision publicly.
When the board gives policy recommendations, Facebook will give public updates and publish a response on the guidance and follow-on action within 30 days.
For more details on the board’s operations, see Facebook’s proposed bylaws.
WHO IS ON THE OVERSIGHT BOARD?
The board will eventually have about 40 members.
Facebook chose the four co-chairs – former US federal circuit judge Michael McConnell and constitutional law expert Jamal Greene from the United States, Colombian attorney Catalina Botero-Marino and former Danish Prime Minister Helle Thorning-Schmidt – who then jointly selected the other 16 members named so far.
Some were sourced from the global consultations conducted by Facebook to obtain feedback on the oversight board.
The members, who are part-time, so far include constitutional law experts, civil rights advocates, academics, journalists, a Nobel Peace Prize laureate and a former judge of the European Court of Human Rights.
The members are paid by a trust that Facebook has created and will serve three-year terms for a maximum of nine years.
The trustees can remove a member before the end of their term for violating the board’s code of conduct, but not for content decisions.
Thomas Hughes, former executive director for freedom of expression rights group Article 19, was appointed to oversee the board’s full-time administrative staff.