

Facebook’s independent oversight board is finally up and running

You can now complain to the board about Facebook taking down your posts.

Facebook CEO Mark Zuckerberg speaking at a conference in February 2020.
Abdulhamid Hosbas/Anadolu Agency via Getty Images
Shirin Ghaffary
Shirin Ghaffary was a senior Vox correspondent covering the social media industry. Previously, Ghaffary worked at BuzzFeed News, the San Francisco Chronicle, and TechCrunch.

Facebook’s much-anticipated independent decision-making body, the Facebook oversight board, announced it will start allowing people to submit cases for review beginning today.

That means if you post something on Facebook or Instagram and it’s taken down for violating any of Facebook’s ever-changing rules on things like hate speech, nudity, misinformation, or violence, you will soon be able to appeal that decision to someone besides Facebook. The option will roll out in waves, and Facebook says it will be available to all users within the next few weeks.

Social media experts have long awaited the Board’s launch because it’s expected to serve as the final decision-maker in how Facebook handles complicated and problematic posts, which have plagued the social media company. Look, for example, at how it managed the unsubstantiated New York Post article about Hunter Biden or any of the countless times Facebook has been accused of letting racist hate speech run rampant on its platform. The Board said it will prioritize cases that threaten to harm freedom of expression or human rights, but declined to comment on specific cases it plans to take.

Facebook’s oversight board is made up of a group of 20 academics, journalists, and international policy experts from around the world, and is set up as a separate company from Facebook, funded by a $130 million independent trust. Its decisions on individual pieces of content are binding, meaning Facebook has agreed to follow whatever decisions the Board makes, and the group can also make broader policy recommendations to Facebook — although those won’t be binding. That means the Board has the power to overrule even Facebook CEO Mark Zuckerberg, who has a history of taking stubborn stances in the name of protecting free expression. Zuckerberg allowed President Trump’s “when the looting starts, the shooting starts” post in response to Black Lives Matter protests in Minneapolis, and, until recently, allowed Holocaust denialism on Facebook — even when some of his own employees, civil rights leaders, and others had raised serious concerns.

“The Board is eager to get to work,” said Catalina Botero Marino, co-chair of the oversight board, in a press statement on Thursday. “We won’t be able to hear every appeal, but want our decisions to have the widest possible value, and will be prioritizing cases that have the potential to impact many users around the world, are of critical importance to public discourse, and raise questions about Facebook’s policies.”

At a time when Facebook is being criticized by US politicians on both sides of the aisle for how it handles contentious speech on its platform, the Board is meant as an outside check on Facebook’s power. Some, though, have criticized the Board, saying it was too slow in getting started (Facebook CEO Mark Zuckerberg first publicly described the idea two years ago) and too narrow in scope to meaningfully change how Facebook handles hate speech and misinformation. For example, for now, users will only be able to appeal cases where they feel their content is wrongfully taken down, not cases in which they think inflammatory content is wrongfully staying up on the platform (the Board says that latter option will come in the next few months).

“Facebook was always criticized for moving fast and breaking things. I think we are looking at this as the opposite of that,” said oversight board co-chair and former Danish Prime Minister Helle Thorning-Schmidt on a press call with reporters on Thursday.

Critics point out that the oversight board seems unlikely to help Facebook deal with one of the most controversial content moderation challenges it has faced to date: the 2020 US presidential election.

President Trump has been making unsupported assertions on Facebook and Twitter for months now that the election is “rigged,” centering on false claims about mail-in voting — claims Facebook has labeled with a generic link to nonpartisan voting information, and which Twitter has, at times, more aggressively labeled as “misleading” and fact-checked.

Many anticipate that Trump — or other politicians — could question the results of the election or declare a premature victory on social media before the race is called. In that case, it would be up to Facebook or Twitter to decide how to deal with such a declaration. (Facebook and Twitter have signaled they would fact-check and label such a post or even take it down, depending on what it says.) Whatever decision these companies make will be widely controversial.

But it seems unlikely the Board will take any cases in time to impact election-night posts or regulate misinformation in the remaining days until the election.

That’s because it will take up to 90 days for the Board to decide on a case — and that’s after the Board even figures out which cases it wants to hear first. Facebook itself can submit a case to the Board for expedited review, but on a press call with journalists on Thursday morning, the company said it will not send any cases to the Board before November 3.

“We are not going to send something for expedited review before the election,” said head of strategic initiatives at Facebook Brent Harris. “And we have done that because we do not wish to place undue pressure on the board.”

Last month, a group of 25 experts from academia, civil rights, politics, and journalism announced they were creating an ad-hoc group to scrutinize Facebook’s oversight board, calling themselves “The Real Facebook Oversight Board.”

Facebook oversight board’s Thorning-Schmidt said she welcomes the feedback.

“We welcome all debate on this,” she said. “Part of the reason why we have joined this cause is because we want the debate around content moderation.”
