Employment & Labor

Facebook Pays Settlement To Content Moderators For Trauma

By Simpluris Research

NPR recently reported that Facebook agreed to a $52 million settlement of a class action lawsuit alleging that it failed to provide a safe working environment for its content moderators. To maintain a safe and "sanitized platform," Facebook contracts third-party content moderators to review user-generated content uploaded to its online platform. Posts that violate the corporation's terms of use, laid out in its Community Standards, are removed.

Much of what content moderators witness consists of "videos, images, and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder." The suit claims that as a result of repeated exposure to this extreme graphic content, many content moderators suffered "debilitating physical and psychological harm," including post-traumatic stress disorder (PTSD). The lawsuit also claimed that "Facebook helped draft workplace safety standards to protect content moderators," which included "providing moderators with robust and mandatory counseling and mental health supports; altering the resolution, audio, size, and color of trauma-inducing images; and training moderators to recognize the physical and psychological symptoms of PTSD." However, the plaintiffs alleged, Facebook fell short of actually implementing the safety standards it helped create, thereby breaching its duty under California labor law to protect employees from unsafe working conditions.

This groundbreaking litigation fixed a major workplace problem involving developing technology and its impact on real workers who suffered in order to make Facebook safer for its users.
- Steve Williams

The settlement gives class members, consisting of current or former content moderators, $1,000 each and up to $50,000 for medical treatment. Although the social media giant denies any wrongdoing, it agreed to provide mental health counseling to its moderators. A Facebook statement read, "We are grateful to the people who do this important work to make Facebook a safe environment for everyone. We're committed to providing them additional support through this settlement and in the future." Steve Williams, a lawyer representing the workers, said, "We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago. The harm that can be suffered from this work is real and severe."

While this litigation served as a step in the right direction for moderators at the job site, the recent coronavirus outbreak has presented challenges for third-party moderators working from home. First, moderators working remotely will only be able to review a restricted amount of the usual content due to security and privacy concerns. At a worksite, employees work behind layers of preventive security controls; home setups lack those same safeguards, leaving remote work more vulnerable from an information security standpoint.

Another issue is the availability of wellness procedures and support for those working remotely. According to a BBC report, Mark Zuckerberg said that decisions regarding the most disturbing content "would be taken over by Facebook's full-time staff because the infrastructure was not in place to support the mental health repercussions of the contractors dealing with the posts." Still, a solution that provides remote moderators with some level of psychological and health monitoring will need to be found.

Third, content moderators' ability to hit accuracy goals in removing toxic and disturbing content raises another question. Quarantine and isolation could worsen depression and other mental health issues, thus increasing the volume of inappropriate or harmful posts. Although Facebook employs 15,000 third-party content moderators, the sheer volume could cause some questionable content to be overlooked.

The last issue is the use of artificial intelligence to help evaluate exploitative content on the social network. "Facebook has been working on algorithms for several years to automatically spot and remove content that violates its policies," but it recently found that some content had been mistakenly censored from the platform due to "a spam-filter bug." Glitches like these may continue to emerge, but AI could still help fill some gaps. It may replace some moderation jobs, though not all; humans remain better suited to decipher content and language in their cultural and political context.

Overall, tech companies will need to re-examine their current practices to protect workers not only from a physical standpoint but also from a mental health perspective, especially as many employees now work from home.
