NAIROBI/SAN FRANCISCO (Reuters) – Facebook Inc’s battle against hate speech and other types of problematic content is being hampered by the company’s inability to keep up with a flood of new languages as mobile phones bring social media to every corner of the world.
An illustration photo shows the Facebook page displayed on a mobile phone internet browser held in front of a computer screen in a cyber-cafe in the centre of Nairobi, Kenya, April 18, 2019. REUTERS/Stringer
The company offers its 2.3 billion users features such as menus and prompts in 111 different languages, which are deemed officially supported. Reuters found 31 additional widely spoken languages on Facebook that have no official support.
Detailed rules known as “community standards,” which bar users from posting offensive material including hate speech and celebrations of violence, had been translated into only 41 of the 111 supported languages as of early March, Reuters found.
Facebook’s 15,000-strong content moderation workforce speaks about 50 languages, though the company said it hires professional translators when needed. Automated tools for identifying hate speech work in about 30.
The language deficit complicates Facebook’s fight to rein in harmful content and the damage it can cause, including to the company itself. Countries including Australia, Singapore and the United Kingdom are now threatening harsh new regulations, punishable by steep fines or jail time for executives, if it fails to swiftly remove objectionable posts.
The community standards are updated monthly and run to about 9,400 words in English.
Monika Bickert, Facebook’s vice president in charge of the standards, previously told Reuters that getting them translated into many different languages was “a heavy lift.”
A Facebook spokeswoman said this week that the rules are translated on a case-by-case basis, depending on whether a language has a critical mass of usage and whether Facebook is a primary source of information for its speakers. She said there is no specific numerical threshold that defines a critical mass.
She said translation priorities include Khmer, the official language of Cambodia, and Sinhala, the dominant language of Sri Lanka, where the government blocked Facebook this week to stem rumors after the devastating Easter Sunday bombings.
A Reuters report last year found that hate speech on Facebook helped fuel ethnic cleansing in Myanmar and went unchecked, partly because the company was slow to add moderation tools and personnel for the local language.
Facebook says its rules are now available in Burmese and that it has more than 100 Burmese speakers among its employees.
The spokeswoman said Facebook’s efforts to protect people from harmful content involve a level of language investment that exceeds that of most technology companies.
But human rights officials say Facebook risks repeating its Myanmar failures in other conflict-torn countries where its language capabilities have not kept pace with the impact of social media.
“These are designed to be the rules of the road, and both customers and regulators should continue to insist that social media platforms make the rules known and effectively police them,” said Phil Robertson, deputy director of Human Rights Watch’s Asia division. “Failing to do so opens the door to serious abuses.”
ABUSE IN FIJI
Mohammed Saneem, Fiji’s supervisor of elections, said he felt the impact of the language gap during the South Pacific nation’s elections last November. Racist comments spread on Facebook in Fijian, a language the social network does not support. Saneem said he had to email translations to a Facebook employee in Singapore to get results.
Facebook said it acted on the translation requests, and it gave Reuters a post-election letter from Saneem praising its “timely and effective assistance.”
Saneem told Reuters he appreciated the help, but that he had expected Facebook to take proactive measures.
“If they are allowing users to post in their language, there must be guidelines available in the same language,” he said.
Similar problems abound in African countries such as Ethiopia, where deadly ethnic clashes among a population of 107 million people have been accompanied by ugly Facebook content. Much of it is in Amharic, a language Facebook supports. But Amharic users who look up the rules get them in English.
At least 652 million people worldwide speak languages that Facebook supports but for which the rules have not been translated, according to data from language encyclopedia Ethnologue. Another 230 million or more speak one of the 31 languages with no official support.
Facebook relies on automated software as a major defense against inappropriate content. Developed using a type of artificial intelligence known as machine learning, these tools identify hate speech in about 30 languages and “terrorist propaganda” in 19, the company said.
Machine learning requires huge amounts of data to train computers, and a scarcity of text in some languages is a challenge to quickly developing new tools, Guy Rosen, Facebook’s vice president in charge of automated policy enforcement, told Reuters.
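Rosen’s point about training data can be seen in even the simplest text classifier. The sketch below is a generic, illustrative bag-of-words Naive Bayes model, not Facebook’s actual system, and the example posts and labels are invented. A model trained only on English examples has no vocabulary for posts in an unsupported language, so it cannot distinguish them.

```python
from collections import Counter, defaultdict
import math

def tokenize(text):
    return text.lower().split()

def train(examples):
    """examples: list of (text, label) pairs; returns per-label word and label counts."""
    word_counts = defaultdict(Counter)  # label -> word frequencies
    label_counts = Counter()
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(tokenize(text))
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    vocab = {w for counts in word_counts.values() for w in counts}
    total_docs = sum(label_counts.values())
    scores = {}
    for label, count in label_counts.items():
        log_prob = math.log(count / total_docs)  # class prior
        total_words = sum(word_counts[label].values())
        for word in tokenize(text):
            # Laplace smoothing; words never seen in training contribute
            # almost no signal, which is the core language-coverage problem.
            log_prob += math.log((word_counts[label][word] + 1) /
                                 (total_words + len(vocab) + 1))
        scores[label] = log_prob
    return max(scores, key=scores.get)

# Hypothetical English-only training data.
training = [
    ("we celebrate this attack", "violating"),
    ("glory to the attackers", "violating"),
    ("happy birthday to my friend", "benign"),
    ("lovely weather this morning", "benign"),
]
wc, lc = train(training)
print(classify("we celebrate the attackers", wc, lc))  # prints "violating"
# A post in an unsupported language consists entirely of unseen words,
# so the classifier can only fall back on its priors, not the content.
```

Building such a model for a new language requires labeled examples in that language, which is exactly the scarcity Rosen described.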
GROWTH REGIONS
Beyond automation and a small number of official fact-checkers, Facebook relies on users to report problematic content. That creates a major blind spot in places where the community standards are not understood, or not even known to exist.
Ebele Okobi, Facebook’s director of public policy for Africa, told Reuters in March that the continent had the lowest rates of user reporting in the world.
“A lot of people do not even know that there are community standards,” Okobi said.
Facebook has bought radio ads in Nigeria and worked with local organizations to change that, she said. It has also held talks with African education officials about adding social media etiquette to school curricula, she said.
At the same time, Facebook is working with wireless carriers and other groups to expand internet access in countries such as Uganda and the Democratic Republic of Congo, where it has yet to officially support common languages such as Luganda and Kituba. Asked this week about expanding without language support, Facebook declined to comment.
The company announced in February its first 100 sub-Saharan Africa-based content moderators, working out of an outsourcing facility in Nairobi. They will join existing teams in reviewing content in Somali, Oromo and other languages.
But the community standards have not been translated into Somali or Oromo. Posts in Somali last year celebrating the al-Shabaab militant group remained on Facebook for months, despite the ban on glorifying organizations or acts that Facebook designates as terrorist.
“Unbelievers and apostates, die with your anger,” read a post seen by Reuters this month that praised the killing of a Sufi cleric.
After Reuters inquired about the post, Facebook said it removed the author’s account because it violated policy.
ABILITY TO DERAIL
Posts in Amharic reviewed by Reuters this month attacked the Oromo and Tigray ethnic groups in vicious terms that clearly violated Facebook’s ban on attacking ethnic groups with “violent or degrading speech, statements of inferiority, or calls for exclusion.”
Facebook deleted the two posts after Reuters inquired about them. The company said an error had allowed one of them, from December 2017, to remain online despite an earlier user report.
For officials such as Saneem in Fiji, Facebook’s efforts to improve content moderation and language support are painfully slow. Saneem said he warned Facebook months before the elections in the archipelago of 900,000 people. Most of them use Facebook, with about half writing in English and half in Fijian, he estimated.
“Social media has the ability to completely derail an election,” Saneem said.
Other social media companies face the same problem to varying degrees.
Facebook-owned Instagram offers its 1,179-word community guidelines in 30 of the 51 languages available to users. WhatsApp, also owned by Facebook, provides its terms of service in nine of its 58 supported languages, Reuters found.
Alphabet Inc’s YouTube presents its community guidelines in 40 of its 80 available languages, Reuters found. Twitter Inc offers its rules in 37 of 47 supported languages, and Snap Inc’s are in 13 of 21.
“A lot of misinformation is being spread, and the problem is content publishers’ hesitation to act,” Saneem said. “They do owe a duty of care.”
Reporting by Maggie Fick in Nairobi and Paresh Dave in San Francisco; Additional reporting by Alister Doyle in Fiji and Omar Mohammed in Nairobi; Editing by Jonathan Weber and Raju Gopalakrishnan