
Facebook hasn’t been forthcoming about a program that exempts high-profile users from its community standards. (Photo: Sarah Tew/CNET)
Facebook’s oversight board, which reviews the social network’s content decisions, said Thursday that the company should be more transparent about the policies it applies to the accounts of high-profile users in its Cross Check program.
The program, also known as XCheck, exempts posts by high-profile users, such as celebrities and political leaders, from the community standards Facebook applies to the rest of its users. Cross Check was the centerpiece of a Wall Street Journal story that launched a series on the company based on leaked documents that illuminate how much the social network knows about its effects on users.
The oversight board, a quasi-independent body set up by Facebook, said the company failed to provide relevant or complete information on high-profile user decisions, such as former President Donald Trump’s suspension.
“In the board’s view, the team within Facebook tasked with providing information has not been fully forthcoming on cross-check,” the board said in a statement. “On some occasions, Facebook failed to provide relevant information to the board, while in other instances, the information it did provide was incomplete.”
Facebook has asked the board to review Cross Check, a request the board has accepted.
Frances Haugen, the whistleblower who leaked the internal documents, will meet with the oversight board to discuss Cross Check and the company’s content moderation. The former Facebook product manager will appear before the UK Parliament next week and has already testified to Congress.
Critics of Facebook, whose platform Russia used to try to influence the 2016 presidential election, say the company doesn’t take its responsibilities seriously enough and that the oversight board neither moves fast enough nor goes far enough. A group of vocal critics has set up a shadow organization it calls the Real Facebook Oversight Board.
To date, the board’s highest-profile action was upholding Facebook’s suspension of Trump’s Facebook and Instagram accounts. In May, the board said the social network was justified in suspending Trump amid concerns he could foment more violence after the deadly Capitol Hill riot on Jan. 6.
Here’s what you need to know about Facebook’s oversight board:
What are the board’s responsibilities?
Let’s get something straight: The oversight board doesn’t do the same job as content moderators, who make decisions on whether individual posts to Facebook comply with the social network’s rules. The board exists to support the “right to free expression” of Facebook’s nearly 3 billion users.
The board functions a lot like a court, which isn’t surprising given that a Harvard law professor came up with the idea. Users who believe content moderators have removed their posts improperly can appeal to the board for a second opinion. If the board sides with the user, Facebook must restore the post. Facebook can also refer cases to the board.
The oversight board can also make suggestions for changes to Facebook’s policies. Over time, those recommendations could affect what users are allowed to post, which could make content moderation easier.
Why does Facebook want an oversight board?
Facebook gets criticized by just about everybody for just about every decision it makes. Conservatives say the company and the rest of Silicon Valley are biased against their views. They point to the suspensions of Trump and right-wing extremist Alex Jones.
The social network doesn’t get much love from progressives, either. They complain Facebook has become a toxic swamp of racist, sexist and misleading speech. Some progressive groups underlined their concerns in summer 2020 by calling on companies to avoid advertising on Facebook and publicizing the boycott with the hashtag #StopHateForProfit.
The oversight board can help Facebook deal with those complaints while lending credibility to the social network’s community standards, a code of conduct that prohibits hate speech, child nudity and a host of other offensive content. By letting an independent board guide decisions about this content, Facebook hopes it will develop a more consistent application of its rules, which in the past have generated complaints for appearing arbitrary.
One example: Facebook’s 2016 removal of an iconic Vietnam War photo showing a naked girl fleeing a napalm attack. The company initially defended the removal, saying the Pulitzer Prize-winning image violated its rules on child nudity, but reversed course shortly afterward as global criticism mounted over the removal of a vital historical image.
Why isn’t the board part of Facebook?
It’s no secret that Facebook has a trust problem. Regulators, politicians and the public all question whether the decisions the company makes serve its users or itself. Making the board independent of Facebook should, the company reckons, give people confidence that its decisions are being made on the merits of the situation, not on the basis of the company’s interests.
Who has Facebook chosen to be on this board?
In spring 2020, Facebook named the first 20 members of the board, a lineup that includes former judges and current lawyers, professors and journalists. It also includes a former prime minister and a Nobel Peace Prize winner. The board can be expanded to 40 people. The members have lived in nearly 30 countries and speak almost as many languages. About a quarter come from the US and Canada.
Serving on the board is a part-time job, with members paid through a multimillion-dollar trust. Members serve three-year terms, and the board has the power to select future members. It hears cases in panels of five members chosen at random.
Trump and conservatives were unhappy with the makeup of the board, which they saw as too liberal, according to The New Yorker. The former president even called CEO Mark Zuckerberg to express this sentiment, but Facebook didn’t change the board members.
Is the board really independent if Facebook is paying it?
If you’re skeptical, we hear you. Facebook doesn’t have a great reputation for transparency.
That said, the charter establishing the board provides details of the efforts Facebook is taking to ensure the board’s independence. For example, the board isn’t a subsidiary of Facebook. It’s a separate entity with its own headquarters and staff. It maintains its own website (in 18 languages, if you count US and UK English separately) and its own Twitter account.
Still, when it comes to money, the board is indirectly funded by Facebook through a trust. Facebook is funding the trust to the tune of $130 million, which it estimates will cover years of expenses.
Facebook says it will abide by the board’s decisions even in cases when it disagrees with a judgment. (The social network says the only exceptions would be decisions that would force it to violate the law, an unlikely occurrence given the legal background of many board members.)
The board also aims to hold Facebook accountable, publishing an annual report that will include a review of Facebook’s actions in response to its decisions.
Read more: Here’s how you can submit an appeal to Facebook’s oversight board.
Tell me more about the Trump decision.
Sure. The board decided in May that Facebook was justified in suspending Trump out of concern that the former president could incite more violence after he whipped up supporters as Congress gathered to certify Joe Biden’s election. The decision, however, wasn’t a blanket endorsement of Facebook’s action: The board took exception to the open-ended nature of the penalty and said Facebook should reconsider the length of Trump’s ban and complete that review within six months.
The former president was kicked off Facebook and its Instagram photo-sharing service in the wake of the Jan. 6 Capitol Hill riot. Other social networks, including Twitter, also took action against Trump, who used their services to fan doubt over the legitimacy of the 2020 presidential election.
The case highlighted the difficult balance private social media companies must strike when handling political speech by public figures. Zuckerberg made the decision to ban Trump, who was still in office at the time, saying the risks of allowing him to continue posting were “simply too great.”