Selena Scola, a content moderator at Facebook, filed a class action lawsuit against the company on Friday, claiming it does not protect employees from the mental trauma caused by the graphic images they see online every day.
“Every day, Facebook users post millions of videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder,” according to the complaint.
Facebook, which currently employs at least 7,500 content moderators, has put workplace safety standards into place to protect them, including counseling and mental health support, changing the way traumatic images appear and training moderators to recognize the symptoms of PTSD. But the lawsuit claims Facebook ignores its own workplace safety standards and violates California law by requiring its moderators to work in “dangerous conditions that cause debilitating physical and psychological harm.”
“Facebook does not provide its content moderators with sufficient training or implement the safety standards it helped develop,” the complaint said. “Facebook content moderators review thousands of trauma-inducing images each day, with little training on how to handle the resulting distress.”
The lawsuit details that Scola’s PTSD symptoms can be triggered if she touches a computer mouse, enters a cold building, sees violence on TV or hears loud noises. Remembering or discussing the graphic imagery she saw on Facebook is also a trigger.
Facebook is reviewing the claim, director of corporate communications Bertie Thomson said in a statement. Thomson said the company provides mental health support to content moderators.
“We recognize that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources,” Thomson said. “Facebook employees receive these in house and we also require companies that we partner with for content review to provide resources and psychological support, including onsite counseling – available at the location where the plaintiff worked – and other wellness resources like relaxation areas at many of our larger facilities.”
Facebook said in July that all content reviewers have access to mental health resources, including onsite counselors, and that all reviewers have full health care benefits.