TikTok sued by former content moderator traumatised by disturbing videos

A former content moderator Friday sued TikTok and its parent company ByteDance for failing to provide a safe work environment against psychological trauma resulting from continued exposure to graphic and objectionable content. The plaintiff, Candie Frazier, filed the class action suit in the US District Court for the Central District of California on behalf of all others similarly situated.

Frazier works for Telus International, a firm that provides content moderators to TikTok. She alleges that content moderators spend 12 hours a day reviewing videos with graphic and objectionable content, including child sexual abuse, rape, bestiality, beheadings, suicide, and genocide. They are also repeatedly exposed to conspiracy theories, propaganda, denial of historical facts, fringe beliefs, and political disinformation.

She alleges that moderators are punished for taking time away from watching videos and that their performance is closely monitored, causing them severe harm:

As a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace, Ms. Frazier has developed and suffers from significant psychological trauma including anxiety, depression, and posttraumatic stress disorder.

Frazier further alleges that TikTok and ByteDance are aware of these adverse psychological effects but have failed to adopt industry-standard safety norms, such as more frequent breaks for moderators, mental health support and treatment, and technical safeguards like blurring or reducing the resolution of videos under review. She contends that TikTok has thus acted negligently and failed to provide the safe working environment mandated under California's labor laws. She is asking the court to order TikTok to compensate existing and former content moderators, as well as to adopt better safety standards and establish a medical safety fund.

Last year, a similar lawsuit filed by Facebook's content moderators who developed PTSD resulted in a $52 million settlement, with the tech firm promising to reform its content moderation tools and provide access to regular psychological support.