
Facebook meeting shows challenges for proposed 'oversight board'

SINGAPORE (Reuters) – Facebook’s new effort to bring external experts into its content review process promises to be complex and potentially controversial, as discussions at a meeting in Singapore this week showed.

FILE PHOTO: A Facebook sign is seen during the China International Import Expo (CIIE), at the National Exhibition and Convention Center in Shanghai, China November 5, 2018. REUTERS/Aly Song/File Photo

Over the course of two days, 38 academics, nonprofit officials and others from 15 Asian countries invited to a Facebook workshop grappled with how a proposed external “oversight board” for content decisions could function.

The meeting, the first of a half dozen planned for cities around the world, produced one clear piece of advice: the new board should be empowered to weigh in not only on specific cases, but on the policies and processes behind them.

Facebook has long faced criticism for doing too little to block incitement to hatred, calls to violence, bullying and other forms of content that violate its “community standards.”

In Myanmar, for example, Facebook for years took little action while the platform was used to incite violence against the Rohingya minority.

But the company also comes under fire for not doing enough to defend free expression. Activists accuse it of taking down posts and blocking accounts for political or commercial reasons, an accusation it denies.

Facebook CEO Mark Zuckerberg floated the idea of an independent oversight board in November of last year, and a draft charter was released in January.

“We want to find a way to strengthen fair process and procedural justice,” Brent Harris, director of global affairs and governance at Facebook, said at the opening of the Singapore meeting. A Reuters reporter was invited to observe the proceedings on the condition that the names of the participants and some details of the talks not be made public.

Facebook’s original plan calls for a 40-person board that would function as a court of appeal on content decisions, with the power to make binding rulings on specific cases.

But as participants peppered Facebook officials with questions and worked through issues such as how the board would be chosen and how it would select cases, they repeatedly came back to questions of policy. Rulings on individual posts would mean little if they were not linked to the underlying content review policies and procedures, many participants said.

Hate speech policy was a major focus of the discussion. Many participants said they felt Facebook was often too lax and blind to local conditions, but the company is committed to the concept of a single set of global standards and a deliberate preference for leaving content on the site.

More than one million Facebook posts per day are reported for violations of the standards, which include detailed rules on everything from pictures of dead bodies (usually allowed) to explicit sexual conversations (usually not allowed).

The company has been strengthening enforcement. It now has an army of 15,000 content reviewers, many of them low-paid contractors, tasked with checking posts that are reported for violations and deciding what to delete. Difficult decisions, or those involving politically controversial issues, are often “escalated” to the company’s content policy team.

One of the examples discussed at the Singapore meeting concerned a post that was reported more than 2,000 times and evaluated 108 times by different content moderators, each of whom concluded that the post did not violate the standards and should stay up.

But after it was escalated to content policy staff who had more information about the political context, it was removed. Participants at the meeting appeared unanimous that it should indeed have come down.

The room was split almost evenly on a second case, involving a phrase that some saw as a violation of the rules against hate speech but others read as a joke. In that instance, the content had remained on the service for many months before it was reported, and Facebook took it down.

Reporting by Jonathan Weber; Editing by Neil Fullick
