
Former Content Moderator Sues Facebook, Claiming the Job Gave Her PTSD


A content moderator at Facebook filed a class action lawsuit against the company on Friday, claiming it does not protect employees from the mental trauma caused by the graphic images they see online every day.

Selena Scola, a content moderator responsible for viewing and removing any Facebook posts that violate the platform’s terms of use, said she suffers from psychological trauma and post-traumatic stress disorder as a result of “constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace,” the complaint reads.

“Every day, Facebook users post millions of videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder,” according to the complaint.

Facebook, which currently employs at least 7,500 content moderators, has put workplace safety standards in place to protect them, including counseling and mental health support, changes to the way traumatic images are displayed and training to help moderators recognize the symptoms of PTSD. But the lawsuit claims Facebook ignores its own workplace safety standards and violates California law by requiring its moderators to work in “dangerous conditions that cause debilitating physical and psychological harm.”

“Facebook does not provide its content moderators with sufficient training or implement the safety standards it helped develop,” the complaint said. “Facebook content moderators review thousands of trauma-inducing images each day, with little training on how to handle the resulting distress.”

The lawsuit details that Scola’s PTSD symptoms can be triggered if she touches a computer mouse, enters a cold building, sees violence on TV or hears loud noises. Remembering or discussing the graphic imagery she saw on Facebook is also a trigger.

Facebook is reviewing the claim, director of corporate communications Bertie Thomson said in a statement. Thomson said the company provides mental health support to content moderators.

“We recognize that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources,” Thomson said. “Facebook employees receive these in house and we also require companies that we partner with for content review to provide resources and psychological support, including onsite counseling – available at the location where the plaintiff worked – and other wellness resources like relaxation areas at many of our larger facilities.”

Facebook said in July that all content reviewers have access to mental health resources, including onsite counselors, and that all reviewers have full health care benefits.

Write to Mahita Gajanan at mahita.gajanan@time.com