A former content moderator is suing Facebook for giving her PTSD

A former Facebook content moderator is suing the company for exposing her to countless distressing images that she claims have given her post-traumatic stress disorder (PTSD), a condition frequently diagnosed in combat veterans, among others. She also alleges that Facebook does not provide adequate mental health care to its moderators, who have to examine some of the internet’s worst content on a daily basis.

The former moderator, Selena Scola, filed the complaint in a court in San Mateo, California, on Friday (Sept. 21). If the court approves, the suit would become a class action, since, the document alleges, Scola’s experience was “typical” of the legions of moderators Facebook hires. The complaint was first reported by Motherboard.

The complaint does not contain a specific account of Scola’s daily work, because she fears violating the terms of a non-disclosure agreement she signed with Facebook. Scola worked as a content moderator from June 2017 to March 2018 at Pro Unlimited, a Facebook contractor that is also named as a defendant in the suit.

From the complaint, we only know that she was “exposed to thousands of images, videos, and livestreamed broadcasts of graphic violence.” She developed “debilitating PTSD” during that time, and continues to suffer from it today, the complaint says.

“Ms. Scola’s PTSD symptoms may be triggered when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises, or is startled. Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator.”

Facebook had not provided comment at the time of publication.

There are various ways to at least partially shield moderators from the harmful effects of viewing disturbing content, and, as the complaint notes, Facebook helped create industry standards in this regard (these include limiting the time of exposure, distorting the images in various ways, and teaching moderators coping strategies). But, the complaint says, “Facebook does not provide its content moderators with sufficient training or implement the safety standards it helped develop.”

Approximately 7,500 content moderators currently work on Facebook content. The company plans to increase this number as its user base grows and as it continues to face controversies over allowing harmful posts to linger on the platform. Each day, reviewers look through about a million pieces of content flagged by users or artificial intelligence. Many of them are based overseas and are paid low wages.

Past accounts from content moderators—working for Facebook and other tech companies—and investigations by undercover journalists have shed some light on the daily horrors the moderators face, including beheadings, sexual violence against children, graphic animal abuse, and livestreams of suicides.

Last year, a content moderator at Microsoft sued the company, claiming that after viewing images and videos of child abuse and killings, he suffered from hallucinations and that spending time around his son triggered traumatic recollections.

In the case against Facebook, Scola wants the company to establish a court-supervised, Facebook-funded medical monitoring program to diagnose and treat content moderators for psychological trauma.
