Facebook Sued Over "Exposure To Disturbing Images" That Caused Trauma

The lawsuit says Facebook did not do enough to protect its moderators from harm. Credit: Dado Ruvic/Reuters

Facebook, which now employs at least 7,500 content moderators, has put workplace safety standards in place to protect them, including counseling and mental health support, changes to the way traumatic images are displayed, and training to help moderators recognize the symptoms of PTSD. The suit, which seeks class-action status, alleges that the company does not follow those standards: "Instead, the multibillion-dollar corporation affirmatively requires its content moderators to work under conditions known to cause and exacerbate psychological trauma."

The lawsuit does not go into further detail about Ms. Scola's particular experience because she signed a non-disclosure agreement that limits what employees can say about their time on the job.

As a content moderator, Scola was responsible for sifting through some of the most graphic and offensive content posted to the site in order to prevent it from being widely seen and shared.

Former content moderator Selena Scola has lodged a suit against the social media company, claiming that "constant and unmitigated exposure to highly toxic and extremely disturbing images" left her with post-traumatic stress disorder.

The lawsuit, filed on September 12 in state superior court in San Mateo County, California, says Facebook content moderators working under contract have to look at thousands of "videos, images and live-streamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder" every day, according to a press release.

According to Business Insider, content moderators regularly view "child abuse imagery, graphic violence, terrorist propaganda, and other ugly material" as a part of their work.

Scola worked at Facebook's offices in Menlo Park and Mountain View, California, for nine months beginning in June of the previous year, under a contract through Pro Unlimited, a Florida-based staffing company.

Why would Facebook be liable?

"Facebook ignores the workplace safety standards it helped create".

"Facebook needs to mitigate the harm to content moderators today and also take care of the people that have already been traumatized", he added.

The suit asks that Facebook and its third-party outsourcing companies provide content moderators with mandatory onsite and ongoing mental health treatment and support, and establish a medical monitoring fund for testing and treating former and current moderators.
