Amnesty International (AI) published a statement Monday condemning TikTok's failure to address previously identified risks to young people's mental health.
AI summarized TikTok's responses to its inquiries about changes implemented since a 2023 report. In those responses, TikTok denied creating user profiles from collected data and listed steps it had taken to protect young users on the app. AI noted that TikTok failed to address the "rabbit hole" problem, pointed to measures that were already in place when the 2023 research was being compiled, and shifted responsibility for protecting young users from the app's negative effects onto those users themselves and their guardians.
In November 2023, AI published a report on how TikTok's "For You" feed encourages self-harm and suicidal ideation. The report also examined the addictiveness of the app, linking it to worsening mental health, sleep and attention disturbance, and "changes in brain structure similar to those observed in people experiencing drug addiction." In its conclusion, AI criticized TikTok's inadequate responses to the identified problems and urged the company to undertake due diligence regarding the safety of young users on the app.
AI’s concerns are amplified by the global legal and legislative responses to TikTok. For example, in 2023, the Irish Data Protection Commission fined TikTok €345 million over inadequate handling of personal data, noting a substantial risk to children.
Legislative action against the app has also increased: Somalia, Nepal, and the US state of Montana each banned TikTok in 2023, citing concerns ranging from the handling of user data to harmful, even deadly, influence on young people and negative effects on social harmony.
There has also been a wave of legal actions against TikTok. Both the US state of Texas and the US Justice Department sued the company for violating children's privacy laws. In 2024, 14 US state attorneys general filed a lawsuit against TikTok, citing harm to children's mental health; they noted an increase in depression, anxiety, eating disorders, and suicidal ideation among US youth. A month later, families in France sued the company for exposing children to content promoting suicide, self-harm, and eating disorders after two 15-year-old girls took their own lives.