Facebook: moderators suffer from post-traumatic stress, content is too violent
Facebook is an essential social network for keeping in touch with your friends wherever you are. It is accessible online or as a mobile app for Android and iOS.
- Downloads: 230
- Release date: 12/18/2024
- Author: Facebook Inc
- License: Free license
- Categories: Internet – Communication
- Operating system: Android, Online service, Windows 10/11, iOS iPhone / iPad
Every day, extremely violent graphic content circulates on Facebook. Moderators are confronted with images of murder, suicide, child sexual abuse and terrorism. Among them, 140 have been diagnosed with post-traumatic stress disorder, according to Dr. Ian Kanyanya, head of mental health services at Kenyatta National Hospital in Nairobi.
Moderators exposed to violent content
In light of this worrying diagnosis of the moderators' mental health, legal action has been taken against Meta and its service provider, Samasource Kenya. Daily exposure to violent content caused generalized anxiety disorder and major depressive disorder among moderators.
Dr. Kanyanya also found severe or extremely severe post-traumatic stress symptoms in 81% of cases, symptoms that persisted for at least a year after the moderator had left the job. At least 40 moderators developed dependencies on alcohol, drugs (cannabis, cocaine, amphetamines) or medications such as sleeping pills.
Exposure to images and videos of necrophilia, bestiality or self-mutilation triggered extreme physical reactions in moderators: fainting, vomiting, screaming and fleeing their workstations. Those responsible for removing content posted by terrorist groups live in fear of being watched and targeted.
Others reported marital breakdowns, loss of sexual desire and isolation from their families. Even more worrying, four moderators developed trypophobia, an aversion to repeated patterns of small holes, after viewing images of decomposing bodies.
Very difficult working conditions
Working conditions at Meta's subcontractor facility in Kenya are very harsh. The moderators, who come from several African countries, work 8 to 10 hours a day in a cold space with bright lights and constant monitoring of their activity. Between 2019 and 2023, they were paid eight times less than their American counterparts who moderated content from the African continent.
The moderators took legal action on several grounds: deliberate psychological endangerment, unfair employment practices, human trafficking, modern slavery and unfair dismissal.
Foxglove, a British organization, is supporting the legal action. Its founder, Martha Dark, explains: “The evidence is irrefutable: Facebook moderation is a dangerous job that inflicts lifelong post-traumatic stress on almost everyone who does it. In Kenya, it traumatized 100% of the hundreds of former moderators tested.”
Meta, for its part, says it takes moderator safety seriously. Its contracts with subcontractors set out requirements for counselling, training, on-site support and access to private healthcare. The social network says it offers above-market salaries and uses techniques such as blurring, sound suppression and monochrome rendering to limit moderators' exposure to violent content.