
Facebook’s huge, secret rulebook for policing speech reveals inconsistencies, gaps and biases


Facebook has attempted to address the misinformation and hate that its platform has enabled with a huge, byzantine and secret rulebook, filled with spreadsheets and PowerPoint slides, that is periodically updated and distributed to its global content moderators.

According to the blockbuster New York Times report, the rules show the social network to be “a far more powerful arbiter of global speech than has been publicly recognized or acknowledged by the company itself.” The Times uncovered a series of gaps, biases and outright errors, including instances in which Facebook allowed extremism to spread in some countries while censoring mainstream speech in others.

Details of the rules were revealed Thursday night thanks to a Facebook employee who leaked more than 1,400 pages of the speech-policing guidelines to the Times because he “feared that the company was exercising too much power, with too little oversight, and making too many mistakes.”


CEO Mark Zuckerberg’s company is trying to monitor billions of posts per day in more than 100 languages while parsing the subtle nuances and complex context of language, images, and even emojis. The group of Facebook employees who meet every other Tuesday to update the rules, according to the Times, are trying to distill extremely complex issues into strict yes-or-no rules.

The Menlo Park, Calif., company then outsources content moderation to other companies that tend to hire unskilled workers, according to the newspaper report. The 7,500-plus moderators “have mere seconds to recall the numerous rules and apply them to the hundreds of posts that dash across their screens each day. When is a reference to ‘jihad,’ for example, forbidden? When is a ‘crying laughing’ emoji a warning sign?”

Some moderators vented their frustrations to the Times, saying posts left up could lead to violence. “You feel like you killed someone by not acting,” one said, speaking anonymously because he had signed a nondisclosure agreement. Moderators also revealed that they face pressure to review about a thousand pieces of content per day, with only eight to 10 seconds to judge each post.

The Times probe, which published a wide range of slides from the rulebook, some easy to understand and others downright head-scratching, detailed numerous cases in which Facebook’s speech rules simply fail. The guidelines for the Balkans appear “dangerously out of date,” an expert on that region told the newspaper. A lawyer in India found “worrying flaws” in Facebook’s guidelines as they relate to his country.



In the U.S., Facebook has banned the Proud Boys, a far-right group that has been accused of inciting real-world violence. It also blocked an ad about the caravan of Central American migrants that was produced by President Trump’s political team.

“It’s not our place to correct people’s speech, but we do want to enforce our community standards on our platform,” Sara Su, a senior engineer on the News Feed, told the Times. “When you’re in our community, we want to make sure that we’re balancing freedom of expression and safety.”

Monika Bickert, Facebook’s head of global policy management, said the primary goal was to prevent harm, and that to a great extent, the company has been successful. But perfection, she said, is not possible.

“We have billions of posts every day, we’re identifying more and more potential violations using our technical systems,” Bickert told the newspaper. “At that scale, even if you’re 99 percent accurate, you’re going to have a lot of mistakes.”


Facebook’s most politically consequential, and potentially disruptive, document may be an Excel spreadsheet that the Times reports lists every group and individual the company has banned as a “hate figure.” Moderators are told to remove any post that praises, supports or represents anyone on that list.

Anton Shekhovtsov, an expert on far-right groups, told the publication he was “confused about the methodology.” The company bans an impressive array of American and British groups, he added, but relatively few in countries where the far right can be more intense, particularly Russia and Ukraine.

Still, there is inconsistency in how Facebook applies its rules. In Germany, where speech is generally more scrutinized, Facebook reportedly blocks dozens of far-right groups. In neighboring Austria, it blocks just a few.

For a tech company to draw these lines is “very problematic,” Jonas Kaiser, a Harvard University expert on online extremism, told the Times. “It puts social networks in the position to make judgment calls that have traditionally been the job of the courts.”

When Mark Zuckerberg prepared to testify before the Senate in April, 100 cardboard cutouts of the Facebook founder and CEO were placed outside the Capitol building in Washington.
(Avaaz)


On how Facebook identifies hate speech, the Times reports:

The guidelines for identifying hate speech, a problem that has bedeviled Facebook, run to 200 jargon-filled, head-spinning pages. Moderators must sort a post into one of three “tiers” of severity. They must bear in mind lists like the six “designated dehumanizing comparisons,” among them comparing Jews to rats.

In June, internal emails allegedly showed that moderators were told to allow users to praise the Taliban, normally prohibited, if they mentioned the group’s decision to enter into a cease-fire. In a separate email obtained by the newspaper, moderators were told to find and remove rumors that wrongly accused an Israeli soldier of killing a Palestinian medic.

The Times investigation concludes that a major obstacle to cracking down on inflammatory speech is Facebook itself, which relies for growth on an algorithm that notoriously promotes “the most provocative content, sometimes of the kind the company says it wants to suppress.”

“A lot of this would be a lot easier if there were authoritative third parties that had the answer,” Brian Fishman, a counterterrorism expert who works with Facebook, told the Times.

Fishman added: “One of the reasons this is hard to talk about is that there is a lack of societal agreement on where this sort of authority should lie.”
