Facebook moderators suffer PTSD-like symptoms from gruesome, violent content, report says

(Jaap Arriens/NurPhoto via Getty Images)

Facebook relies on a low-paid army of content moderators who often endure poor working conditions and suffer PTSD-like symptoms from daily exposure to some of the worst content posted to the social network, according to a scathing new investigative report by The Verge.

The tech publication opens with a description of Chloe, a content moderator at Phoenix, Ariz.-based Cognizant – where 1,000 people make rapid decisions under intense pressure about whether flagged content violates Facebook's rules – who one day has to moderate posts in front of her fellow soon-to-be moderators as they are trained.

“The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice shaking,” The Verge reports, adding that she later leaves the room and cries so hard she can barely breathe.


Facebook, which has faced criticism from all corners over content moderation mistakes and over the sprawling rulebook that guides its moderators, had more than 30,000 employees working on safety and security by the end of last year. About half of those are content moderators, and the tech giant relies on contract labor for most of that work. Facing a never-ending firehose of content, moderators are expected to maintain 95 percent accuracy while reviewing more than 1,000 posts per week for violations of Facebook's community standards.

The Verge's report, based on interviews with a dozen current and former employees, depicts Cognizant as a soul-crushing, morbid environment where employees joke about self-harm, do drugs on the job, and develop severe anxiety or panic attacks because of the gruesome content they are forced to watch. Most of the moderators interviewed quit after about one year.

The Phoenix moderators, according to the report, make about $28,000 per year, while the average Facebook full-time employee earns $240,000. In contrast to the perk-filled life at Facebook's Frank Gehry-designed Menlo Park, Calif., headquarters, moderators in Phoenix are closely monitored by managers and allotted very short breaks for bathroom use and so-called "wellness time."

In addition, moderators told the tech news site that some colleagues have even embraced the fringe, conspiracy-laden views of the memes and posts they are forced to review every day.


Mark Zuckerberg, CEO and founder of Facebook Inc., attends the Viva Tech startup and technology conference at the Parc des Expositions Porte de Versailles on May 24, 2018, in Paris.
(Getty Images)

Both Cognizant and Facebook pushed back on a number of aspects of The Verge's report.

Bob Duncan, who oversees Cognizant's content moderation operations in North America, told The Verge that recruiters carefully explain the graphic nature of the work to applicants. “The intention of all of that is to ensure that people understand. And if they don't believe the work is potentially suited for them based on their situation, they can make those decisions as appropriate.”


Later in the reporting process, Facebook allowed The Verge's reporter to visit the Phoenix site, after telling the publication that the moderators' experiences described do not reflect those of the majority of its contractors, whether in Phoenix or worldwide. New motivational posters had been put up, and several content moderators spoke to The Verge about their satisfaction with their work and how they are treated, claiming that truly terrible, violent content is only a small fraction of what they review.

When the reporter asked one of the on-site counselors about the possibility of employees developing PTSD, he instead spoke about something called “post-traumatic growth.”

The Verge concludes that the “call center model of content moderation is taking an ugly toll on many of its workers. As first responders on platforms with billions of users, they are performing a critical function of modern civil society, while being paid less than half as much as many others who work on the front lines. They do the work as long as they can, and when they leave, a non-disclosure agreement ensures that they retreat even further into the shadows.”


A former contract moderator sued Facebook in September, claiming that her work for the tech giant left her with PTSD.
