Zuckerberg Faces Intense Questioning in Landmark Trial Over Social Media’s Impact on Teen Mental Health
The chief executive of Meta Platforms, Mark Zuckerberg, took the witness stand this week in a closely watched trial examining whether major social media platforms contributed to mental health harm among young users.
During testimony, Zuckerberg was questioned extensively by plaintiffs’ attorneys about internal concerns over the company’s handling of underage users. Lawyers pressed him on whether enough had been done to prevent children under 13 from accessing platforms such as Instagram.
Zuckerberg told the court that the company has strengthened its systems for detecting underage accounts but acknowledged shortcomings in earlier years. “I always wish we could have gotten there sooner,” he said, referring to efforts aimed at improving age verification and safety tools.
He stated that some users misrepresent their age when signing up and that the company removes accounts it identifies as belonging to children below the minimum age requirement. Plaintiffs’ lawyers challenged that explanation, questioning whether young users could reasonably be expected to understand and comply with complex platform policies.
The trial marks the first time Zuckerberg has addressed child safety concerns before a jury. Plaintiffs argue that Meta intentionally designed aspects of its platforms to encourage prolonged engagement, contributing to compulsive usage patterns among teenagers. They contend that this design approach amplified risks for vulnerable users.
The case centers on a 20-year-old woman, identified in court filings as KGM, who alleges that heavy use of Instagram and YouTube worsened her depression and suicidal thoughts during her teenage years. Her lawsuit is one of several “bellwether” cases selected to test how juries may respond to similar claims nationwide. Other technology firms, including TikTok and Snap Inc., are involved in related litigation, though some have reached settlements in this initial phase.
Meta’s legal team disputes that its platforms were a substantial factor in the plaintiff’s mental health struggles. Company attorneys argue that broader personal and environmental issues played a more significant role, citing medical records presented in court.
The proceedings also follow previous public scrutiny of social media companies in Washington. In early 2024, Zuckerberg appeared before lawmakers and issued an apology to families who say their children were harmed through online exploitation. Some of those families remain skeptical, arguing that corporate reforms have not gone far enough.
The company has introduced a series of youth-focused safety measures in recent years, including parental supervision tools and content controls. Critics, however, maintain that enforcement remains inconsistent and that platform design still prioritizes user engagement over well-being.
In separate legal action, New Mexico's attorney general has accused Meta of violating consumer protection laws by failing to adequately disclose what it knew about potential risks to young users. The company has denied those allegations.
The outcome of the current trial could have wide-reaching consequences for the technology industry. Legal analysts say a verdict against Meta might pave the way for significant financial penalties and potential changes to how social media platforms are structured, particularly regarding features that encourage prolonged use.
As testimony continues, the case is being closely watched by lawmakers, mental health advocates, and technology companies alike. The verdict could help define the extent to which social media firms are held accountable not for user content, but for the underlying design of their platforms and its impact on children and teenagers.
