A Former Content Moderator Is Suing Facebook Because Her Job Gave Her PTSD

A former Facebook content moderator says that watching graphic videos of murder, rape and abuse gave her PTSD.

A former Facebook content moderator has sued the company for the PTSD she says she experienced after being exposed to disturbing images. Alexander Koerner/Getty Images

A former content moderator for Facebook (META) has sued the social media company, alleging that she developed post-traumatic stress disorder from being exposed to “videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder” while on the job.

Lead plaintiff Selena Scola, of San Francisco, worked from June 2017 to March 2018 for the contractor Pro Unlimited, Inc., which helps Facebook delete content that violates its community standards. She says she experienced “constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace,” which led to her developing PTSD.

The lawsuit, Scola v. Facebook Inc. and Pro Unlimited Inc., which was filed on September 21 in California state court in San Mateo County, claims that content moderators are not being properly protected. “Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job,” Korey Nelson, of the law firm Burns Charest LLP, told Yahoo Finance.

The suit contends that Facebook does not provide sufficient health services or training methods to prepare or care for employees who take on the task of moderating content posted by the platform’s more than 2 billion global users.

“Ms. Scola’s PTSD symptoms may be triggered when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises or is startled,” the suit states. “Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator.”

Facebook has approximately 7,500 content moderators worldwide whose responsibility is to remove hate speech, graphic violence, images and videos depicting self-harm, nudity and sexual content, bullying and other content that violates its policies.

“We aren’t just exposed to the graphic videos—you’ll have to watch them closely, often repeatedly, for specific policy signifiers,” a source told Vice’s Motherboard.

Scola’s lawsuit is asking the court to create a “Facebook-funded medical monitoring program” to provide testing and care to content moderators with PTSD.