(Image: StockSnap, CC0, via Wikimedia Commons)
California trial over social media harm to children begins

Opening statements began Monday in Los Angeles County Superior Court in a landmark trial against some of the world’s biggest social media companies, including Google, YouTube and Meta, over the platforms’ alleged harmful effects on children.

The case will test claims about addiction and whether the tech giants can be held liable for harming children or whether they are protected under the First Amendment.

One plaintiff, identified as KGM, a 19-year-old California woman, alleged that she became addicted to social media apps as a child and suffered physical and emotional harm as a result. She claimed the apps’ addictive design negatively impacted her mental health.

KGM is one of many plaintiffs suing the social media companies in a number of similar cases that accuse the tech companies of “designing their products to be addictive, especially to children.” In the complaint, the plaintiff explained:

Borrowing heavily from the behavioral and neurobiological techniques used by slot machines and exploited by the cigarette industry, Defendants deliberately embedded in their products an array of design features aimed at maximizing youth engagement to drive advertising revenue. Defendants understand that a child user today becomes an adult user tomorrow.

KGM argued that a lack of user warnings about certain types of content on the platforms leads to “compulsive use and mental health concerns such as depression, anxiety, body dysmorphia, self-harm, and risk of suicide.”

Ashley Simonsen, Meta’s attorney, argued in the pleadings that KGM’s addiction stemmed from content created by other Instagram users, not from any specific design feature of the platform. She explained that social media app features like the “infinite scroll,” which allows users to access a never-ending stream of content, are “content-neutral tools” that facilitate communication and therefore qualify as “‘protected publishing activity’ under the First Amendment and Section 230.”

“Even if the defense loses, it will not mean platforms suddenly become liable for all user-generated content,” Simonsen said. “The narrower significance is that courts may treat addictive or unsafe design as actionable conduct distinct from protected content moderation or publication.”

TikTok and Snapchat were originally named in the suit but have since settled for undisclosed amounts.