On Monday, Amnesty International accused TikTok of insufficiently protecting vulnerable children and teenagers from psychologically harmful content, reconfirming 2023 findings that young users interested in mental health-related content are often led into a “rabbit hole” of videos featuring depressive themes and suicidal ideation. The organization called on the company to adopt a rights-respecting business model instead of one that maximizes engagement by collecting intimate data.
The study focused on TikTok use in France. Using three test accounts and appropriate research controls, the group found that after the accounts sought out mental health content on the platform, their “For you” pages increasingly displayed depressive messaging and videos that romanticized suicide. For instance, two videos of the so-called “lip balm challenge” surfaced in the feed of one test account. Critics say the “challenge” induces self-harm, and it has drawn major media attention in France, although TikTok has denied that it exists.
Experts say this algorithmic tendency has led to troubling outcomes. In 2021, 15-year-old Marie Le Tiec took her own life and was found to have viewed harmful TikTok content in the period leading up to her suicide. Her mother, together with other grieving parents and family members, sued the company for failing to moderate such content. TikTok has denied any wrongdoing, stating that it employs 40,000 moderators and directs users to mental health support resources if they search directly for topics like “self-harm” or “suicide.”
The organization alleged that TikTok’s disregard for systemic harm stems from its engagement-driven model, its “addictive design,” and its hyper-personalized algorithm. As a result, TikTok allegedly fails to meet business and human rights standards, as well as the obligations of the 2023 European Digital Services Act (DSA), which requires platforms to identify and mitigate systemic risks to children. The European Commission opened formal proceedings against TikTok under the DSA in 2024, proceedings that Amnesty aims to supplement with its report.
Psychology experts are divided on whether TikTok content creates depressive and suicidal ideation in children generally or whether it primarily affects those who are already psychologically vulnerable. Marion Haza, a clinical psychologist and lecturer at the University of Poitiers, has argued that harmful content is truly dangerous only for the minority of users who already engage in self-harm. She has also argued that strictly prohibitive measures, especially in adolescence, do not address the root factors that lead to self-harm, stating that group relationships and community support are more effective ways to mitigate mental health risks. Haza noted that platforms like TikTok can also help provide this kind of social connection.
Grégoire Bors, a professor of psychology and cognitive neuroscience at Paris-Cité, has said it is very difficult to establish a clear connection between social media use and self-harm, citing a leading peer-reviewed study which found that only 0.4 percent of the differences in teenagers’ mental health could be attributed to the apps.
In contrast, an Amnesty study carried out at the end of 2024 found that 58 percent of young people surveyed were negatively affected by disturbing content they watched, and that only one in five had successfully avoided depressive content on their “For you” page. As one young interviewee noted, this may partly reflect the fact that likes on depressive posts, which prompt the algorithm to surface more similar content, are often meant to express support or care rather than endorsement of the content or of a suicide attempt in particular. The report, in contrast to Haza, further highlighted community sections and bubbles that encourage and romanticize suicide.
Global leaders have begun proposing legislation to address the potential threat social media poses to children. In September, a French parliamentary report recommended that the government implement a “digital curfew” for children under 16. In Australia, the government has enacted a controversial social media ban for children under 16, which comes into force in December.