A courtroom in Los Angeles has become the opening arena for a case that could redraw the boundaries of responsibility in the social media age. At its heart is a blunt allegation: that some of the world’s biggest platforms were built not just to engage young users, but to keep them coming back at any cost.
The case centers on a 20-year-old California woman, identified in court as Kaley G.M., who says she was pulled into Facebook, Instagram and YouTube while still a child and that the apps never really let her go. According to her claim, features such as endless scrolling, notifications and recommendation loops were engineered with full awareness that they would be especially hard for young brains to resist.
She argues that these design choices deepened her struggles with depression and suicidal thoughts, and that the companies behind the apps should bear responsibility for the damage to her mental health.
The companies reject that framing. Their defense points to other pressures and difficulties in Kaley's upbringing, arguing that her struggles cannot be pinned on social media alone. They also emphasize years of public messaging and product changes aimed at protecting younger users.
The trial is being closely watched because it goes further than the usual arguments over harmful posts or videos. The focus here is not on what users uploaded, but on how the platforms themselves were built and operated. The judge overseeing the case has made that distinction clear for jurors: the question is whether the design of the apps caused harm, not whether they passed along content created by others.
If the jury sides with Kaley, the impact could ripple far beyond this one courtroom. Thousands of similar cases are waiting in California, brought by parents, school districts and state authorities who say social media has taken a toll on children and teenagers. A loss for the tech firms could also weaken Section 230, the long-standing legal shield that has protected internet companies from liability for content posted by their users.
The stakes are high enough that senior executives are expected to take the stand during the trial, which is likely to run into March. Two other platforms, TikTok and Snap, settled with Kaley before the case reached this stage.
The legal pressure is not confined to California. On the same day this trial opened, another jury in New Mexico began hearing claims that Meta profited from its platforms while exposing young users to exploitation and psychological damage. Similar accusations are surfacing across the United States.
Beyond America, governments are also tightening the screws. Australia and Spain have already barred children under 16 from accessing social media platforms, and more countries are weighing similar restrictions as concern over youth mental health grows.
What happens in this Los Angeles courtroom may help determine whether social media remains largely untouchable by design — or whether the architecture of the apps themselves can finally be put on trial.