A Northern California woman hired to review flagged Facebook content has sued the social media giant after she was “exposed to highly toxic, unsafe, and injurious content during her employment as a content moderator at Facebook,” which she says gave her post-traumatic stress disorder (PTSD).

Selena Scola moderated content for Facebook as an employee of contractor Pro Unlimited, Inc. between June 2017 and March of this year, according to her complaint. 

“Every day, Facebook users post millions of videos, images, and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder,” the lawsuit reads. “To maintain a sanitized platform, maximize its already vast profits, and cultivate its public image, Facebook relies on people like Ms. Scola – known as ‘content moderators’ – to view those posts and remove any that violate the corporation’s terms of use.”

“You’d go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off,” one content moderator recently told the Guardian.

According to the lawsuit, Facebook content moderators are asked to review more than 10 million potentially rule-breaking posts per week, with an error rate of less than one percent and a mandate to review all user-reported content within 24 hours. Making the job even more difficult is Facebook Live, a feature that allows users to broadcast video streams on their Facebook pages.

The Facebook Live feature in particular “provides a platform for users to livestream murder, beheadings, torture, and even their own suicides, including the following:” 

In late April a father killed his 11-month-old daughter and livestreamed it before hanging himself. Six days later, Naika Venant, a 14-year-old who lived in a foster home, tied a scarf to a shower’s glass doorframe and hung herself. She streamed the whole suicide in real time on Facebook Live. Then in early May, a Georgia teenager took pills and placed a bag over her head in a suicide attempt. She livestreamed the attempt on Facebook and survived only because viewers watching the event unfold called police, allowing them to arrive before she died.

As a result of having to review this content, Scola says she “developed and suffers from significant psychological trauma and post-traumatic stress disorder (PTSD).” She does not detail the specific imagery she was exposed to, however, for fear that Facebook will enforce the non-disclosure agreement (NDA) she signed.

Scola is currently the only named plaintiff in the class-action suit; however, the complaint says the potential class could include “thousands” of current and former moderators in California.

As Motherboard reports, moderators have to view a constant flood of material and use their judgment to decide how best to censor content under Facebook’s “constantly-changing rules.”

Moderating content is a difficult job—multiple documentaries, longform investigations, and law articles have noted that moderators work long hours, are exposed to disturbing and graphic content, and have the tough task of determining whether a specific piece of content violates Facebook’s sometimes byzantine and constantly-changing rules. Facebook prides itself on accuracy, and with more than 2 billion users, Facebook’s work force of moderators are asked to review millions of possibly infringing posts every day. –Motherboard

“An outsider might not totally comprehend, we aren’t just exposed to the graphic videos—you’ll have to watch them closely, often repeatedly, for specific policy signifiers,” one moderation source told Motherboard. “Someone could be being graphically beaten in a video, and you could have to watch it a dozen times, sometimes with others present, while you decide whether the victim’s actions would count as self-defense or not, or whether the aggressor is the same person who posted the video.” 

The lawsuit also alleges that “Facebook does not provide its content moderators with sufficient training or implement the safety standards it helped develop … Ms. Scola’s PTSD symptoms may be triggered when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises, or is startled. Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator.”

A Facebook spokesperson told Motherboard that the company is “currently reviewing the claim.”

“We recognize that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources,” the spokesperson said. “Facebook employees receive these in house and we also require companies that we partner with for content review to provide resources and psychological support, including onsite counseling—available at the location where the plaintiff worked—and other wellness resources like relaxation areas at many of our larger facilities.”

“This job is not for everyone, candidly, and we recognize that,” Brian Doegan, Facebook’s director of global training, community operations, told Motherboard in June. He said that new hires are gradually exposed to graphic content “so we don’t just radically expose you, but rather we do have a conversation about what it is, and what we’re going to be seeing.”

Doegan said that there are rooms in each office that are designed to help employees de-stress. –Motherboard

“What I admire is that at any point in this role, you have access to counsellors, you have access to having conversations with other people,” he said. “There’s actual physical environments where you can go into, if you want to just kind of chillax, or if you want to go play a game, or if you just want to walk away, you know, be by yourself, that support system is pretty robust, and that is consistent across the board.”
