I was looking a little at reporting on the Facebook and TikTok moderators who sued the companies (or may still be trying to, in class action suits; I'm unsure of the current status), and found this one from TechCrunch, March 2022:
The lawsuit alleges that TikTok and ByteDance violated California labor laws by failing to provide Velez and Young with adequate mental health support in spite of the mental risks of the “abnormally dangerous activities” they were made to engage with on a daily basis. It also claims that the companies pushed moderators to review high volumes of extreme content to hit quotas and then amplified that harm by forcing them to sign NDAs so they were legally unable to discuss what they saw.
“Defendants have failed to provide a safe workplace for the thousands of contractors who are the gatekeepers between the unfiltered, disgusting and offensive content uploaded to the App and the hundreds of millions of people who use the App every day,” the lawsuit states. It alleges that in spite of knowing the psychological risks of prolonged exposure to such traumatic content, TikTok and ByteDance made no effort to provide “appropriate ameliorative measures” to help workers cope with the extreme content after the fact.
What I'd actually like to know is: what the hell are "appropriate ameliorative measures" for content moderators, either while they're still doing the job, or after they've stopped and are still experiencing negative psychological fallout?
I've looked into it a bit, and as far as I can tell, there's really no commonly agreed-upon industry standard for what this means. Is this lawsuit asking for something that doesn't actually even exist?