Facebook's content oversight board: What you need to know

The social network’s so-called supreme court for content-moderation decisions will launch just ahead of the hotly contested US presidential election.

After nearly two years, Facebook’s much-anticipated content oversight board has committed to an October launch, just ahead of a heated US presidential election that’ll likely play out on social media.

The board will have the power to uphold or overturn decisions Facebook makes to pull posts that the social network says violate its rules. It can also review posts that Facebook leaves up but that might violate its standards. Dubbed Facebook's supreme court, the board will have 20 members, including former judges, lawyers, and journalists. Its responsibility extends to Instagram, Facebook's photo-sharing service.

Getting to launch has been a time-consuming process. Facebook CEO Mark Zuckerberg first publicly suggested the concept in November 2018. Members were named a year and a half later.

“Since the first 20 Oversight Board members were appointed back in May, we have been helping to get them up and running as quickly as possible,” a Facebook spokesperson said in a statement, adding that the members have access to a new software tool that lets them log in securely from anywhere in the world. “We look forward to the board beginning to hear cases in mid to late October.”

The timing of the launch will likely provide plenty of grist for the board to consider. Facebook has maintained a mostly hands-off approach to posts by politicians, including President Donald Trump and Democratic challenger Joe Biden, though it’s begun adding a label that directs users to credible election information. The social network has also reportedly started prepping contingency plans in case Trump tries to delegitimize the election results.

The effort hasn't mollified critics. A day after the oversight board announced its target launch date, a group calling itself the Real Facebook Oversight Board said it would review and analyze the social network's content moderation decisions and policies. It will conduct its sessions in public on Wednesdays at 2 p.m. ET for anyone to watch. The platform: Facebook Live.

Here’s what you need to know about Facebook’s oversight board:

Sounds like this board will have a lot of responsibility. What can it do?

Let’s get something straight: The oversight board isn’t going to do the same job as content moderators, who make decisions on whether individual posts to Facebook comply with the social network’s rules. The board exists to support the “right to free expression” of Facebook’s 2.7 billion users.

The board functions a lot like a court, which isn’t surprising given that a Harvard law professor came up with the idea. Users who believe content moderators have removed their posts improperly can appeal to the board for a second opinion. If the board sides with the user, Facebook must restore the post. Facebook can also refer cases to the board. The process is expected to take up to 90 days.

The oversight board can also make suggestions for changes to Facebook’s policies. Over time, those recommendations could affect what users are allowed to post. In turn, those changes could make content moderation easier.

Why does Facebook need an oversight board in the first place? 

Facebook gets criticized by just about everybody for just about every decision it makes. Conservatives say the company is biased against their views. They point to bans of right-wing provocateurs Alex Jones and Milo Yiannopoulos to support their case.

The social network doesn’t get much love from progressives, either. They complain Facebook has become a toxic swamp of racist, sexist, and misleading speech. In July, some progressive groups underlined their concerns by calling on companies not to advertise on Facebook and publicizing the boycott with the hashtag #StopHateForProfit.

The oversight board can help Facebook deal with those complaints while lending credibility to the social network’s community standards, a code of conduct that prohibits hate speech, child nudity, and a host of other offensive content. By letting an independent board guide decisions about this content, Facebook hopes it’ll develop a more consistent application of its rules, which in the past have generated complaints for appearing arbitrary.

One example: Facebook’s 2016 removal of an iconic Vietnam War photo that shows a naked girl fleeing a napalm attack. The company defended the removal, saying the Pulitzer Prize-winning image violated its rules on child nudity. It reversed its decision shortly afterward as global criticism mounted, prompting COO Sheryl Sandberg to apologize to Norway’s prime minister.

Got it. But why does Facebook need to create an independent organization? 

It’s no secret that Facebook has a trust problem. Regulators, politicians, and the public all question whether the decisions the company makes serve its users or itself. Making the board independent of Facebook should, the company reckons, give people confidence that its decisions are being made on the merits of the situation, not on the basis of the company’s interests.

OK. So who has Facebook chosen to be on this board?

Earlier this year, Facebook named the first 20 members of the board, a lineup that includes former judges and current lawyers, as well as professors and journalists. It also includes a former prime minister and a Nobel Peace Prize winner. The board could eventually be expanded to 40 people.

The social network chose a diverse group. The members have lived in nearly 30 countries and speak almost as many languages. About a quarter come from the US and Canada.

At the time of the announcement, Helle Thorning-Schmidt, who served as Denmark's prime minister from 2011 to 2015, said one of the board's biggest advantages would be removing some of the content-moderation responsibility from Facebook itself. As it stands, she said, the decision-making is too centralized.

“Social media can spread speech that is hateful, deceitful and harmful,” she said. “And until now, some of the most difficult decisions around content have been made by Facebook, and you could say ultimately by Mark Zuckerberg.”

Serving on the board is a part-time job, with members paid through a multimillion-dollar trust. Members will serve three-year terms, and the board will have the power to select future members. It'll hear cases in panels of five members chosen at random.

Wait a minute. Facebook is paying the board? Is it really independent?

If you’re skeptical, we hear you. Facebook doesn’t have a great reputation for transparency.

That said, the charter establishing the board provides details of the efforts Facebook is taking to ensure the board’s independence. For example, the board isn’t a subsidiary of Facebook; it’s a separate entity with its own headquarters and staff. It maintains its own website (in 18 languages, if you count US and UK English separately) and its own Twitter account.

Still, when it comes to money, the board is indirectly funded by Facebook through the trust. Facebook is funding the trust to the tune of $130 million, which it estimates will cover years of expenses.

Facebook says it’ll abide by the board’s decisions even in cases when it disagrees with a judgment. (The social network says the only exceptions would be decisions that would force it to violate the law, an unlikely occurrence given the legal background of many board members.)

The board will also work to hold Facebook accountable, publishing an annual report that'll include a review of Facebook's actions in response to its decisions.

“It’ll be very embarrassing for Facebook,” Thorning-Schmidt said, “if they don’t live up to their end of this bargain.”

Credit to CNET for an informative piece.
