HYDERABAD, India/SAN FRANCISCO (Reuters) – On a busy day, employees in India monitoring nudity and pornography on Facebook and Instagram will each review 2,000 posts in an eight-hour shift, or almost four per minute.
FILE PHOTO: A woman looks at the Facebook logo on an iPad in this picture illustration taken June 3, 2018. REUTERS/Regis Duvignau/Illustration/File Photo
They are part of a 1,600-member team at Genpact, an outsourcing company with offices in the south Indian city of Hyderabad, which has been hired to review Facebook content.
Seven Genpact content reviewers said in interviews late last year and in early 2019 that their work is underpaid, stressful and sometimes traumatic. The reviewers, all in their 20s, declined to be identified for fear of losing their jobs or violating non-disclosure agreements. Three of the seven have left Genpact in the last few months.
“I’ve seen female employees break down on the floor, reliving the trauma of watching suicides in real time,” a former employee said. He said he had seen this happen at least three times.
Reuters was unable to independently verify the incidents or determine how often they may have occurred.
Genpact declined to comment.
The working conditions described by the workers provide a window into moderation operations at Facebook and the challenges the company faces as it tries to police content from its 2 billion users. Their accounts contrast in several respects with the picture presented by three Facebook executives in interviews and statements to Reuters: that of a carefully selected, skilled workforce that is paid well and has the tools to handle a difficult job.
Ellen Silver, Facebook’s vice president of operations, acknowledged to Reuters that content moderation “at this size is uncharted territory”.
“We care a lot about getting this right,” she said in January. “This includes the training reviewers receive, our hiring practices, the wellness resources we provide to each and every person reviewing content, and our engagement with partners.”
While declining to comment on the Hyderabad employees’ claims of low pay, Facebook said it had begun drafting a code of conduct for outsourcing partners but declined to provide details.
It also said it would introduce an annual compliance audit of its vendor policies this year to review the work at contractor facilities. The company is organizing a first-ever summit in April to bring together its outsourcing vendors from around the world, with the aim of sharing best practices and bringing more consistency to how moderators are treated.
These efforts were announced in a blog post on Monday by Justin Osofsky, Facebook’s vice president of global operations.
Facebook works with at least five outsourcing vendors in at least eight countries on content review, a Reuters tally shows. Silver said about 15,000 people, a mix of contractors and employees, were working on content review at Facebook as of December. Facebook had over 20 content review sites around the world, she said.
More than a dozen moderators in other parts of the world have spoken of similar traumatic experiences.
A former Facebook contract employee, Selena Scola, filed a lawsuit in California in September, alleging that content moderators who suffer psychological trauma after viewing disturbing images on the platform are not being properly protected by the social networking company.
Facebook, in a court filing, has denied all of Scola’s allegations and called for a dismissal, contending that Scola has insufficient grounds to sue.
Some examples of traumatic experiences among Facebook content moderators in the United States were described this week by The Verge, a technology news website. (bit.ly/2EammsL)
PRESSURE, LACK OF EXPERIENCE
The Genpact unit in Hyderabad reviews posts in Indian languages, Arabic, English and some Afghan and Asian tribal languages and dialects, according to Facebook.
On one team, employees spend their days reviewing nudity and explicit pornography. The “counter-terrorism” team, meanwhile, watches videos that include beheadings, car bombings and electric shock torture sessions, the employees said.
Those on the “self-harm” unit regularly watch live videos of suicide attempts – and do not always succeed in alerting authorities in time, two of the employees said. They told Reuters they had no prior experience with suicide or trauma.
Facebook said its policy called for moderators to alert a “specially trained team” to review situations where there was “potential danger or harm.”
The moderators who spoke with Reuters said that in the cases they knew of, the trained team was called in when there was a possibility of a suicide, but that reviewers continued to monitor the feed even after the team had been alerted.
Job offers and salary pay-slips seen by Reuters showed annual pay at Genpact for an entry-level Facebook Arabic-language content reviewer of 100,000 Indian rupees ($1,404), or slightly more than $6 a day. Facebook said that with benefits included, the real compensation is much higher.
The employees said they do receive transportation to and from work, a common non-cash benefit in India.
Moderators in Hyderabad employed by another IT outsourcing firm, Accenture, monitor Arabic-language content on YouTube on behalf of Google for a minimum of 350,000 rupees annually, according to two of its employees and pay slips seen by Reuters. Accenture declined to comment, citing client confidentiality.
Facebook disputed the pay analysis, saying Genpact is required to pay above industry averages. The vendor, while declining to comment on its work for Facebook, said in a statement that its wages are “significantly higher than the industry standard or the minimum wage set by law.”
THE BIG TARGETS
The Genpact moderators in Hyderabad said Facebook sets performance targets, which are reviewed from time to time, known as Average Review Time or Average Handling Time.
“We have to meet an accuracy rate of 98 percent on massive targets,” one of the moderators told Reuters. “It’s just not easy when you are constantly bombarded with stuff that is mostly mind-numbing.”
The logo of Genpact is seen on the facade of its building in Bengaluru, India, January 29, 2019. REUTERS/Munsif Vengattil
One moderator said she often took work home on her laptop to keep up.
Silver said handling time was tracked to assess whether Facebook needed more reviewers and whether its policies were clear enough. But she acknowledged that some older procedures may have led moderators to feel pressured.
The company also said it was tightening restrictions on employees’ access to its review tools.
Reporting by Munsif Vengattil in Hyderabad and Paresh Dave in San Francisco; Writing by Patrick Graham; Editing by Jonathan Weber and Raju Gopalakrishnan