Parents who say they have lost their children to social media hold up a banner with the children's names outside the court after a jury found Meta and Google's YouTube liable in a key test case accusing the platforms of harming children's mental health through addictive design, in Los Angeles, California, U.S., March 25, 2026. Mike Blake/Reuters
Jamil Jivraj is a child and adolescent psychiatrist in Calgary and clinical assistant professor at the University of Calgary.
On Wednesday, a jury in Los Angeles found Meta and YouTube negligent in the design of their platforms. They awarded US$6-million to a young woman who testified that nearly nonstop social media use since she was a child caused her depression, anxiety, and a devastated sense of self-worth. Meta’s response after the verdict was familiar: her struggles, they said, could not be directly linked to a specific app.
In my clinic, I have been hearing a similar reluctance to blame social media platforms for years. Not from lawyers. From parents.
I see it every week. A parent sits down, describes a child who has stopped sleeping, stopped talking, stopped seeming present, and then pauses. “I don’t know if I’m allowed to blame the phone,” they say. “Maybe I’m just out of touch.”
They are not out of touch.
Something has shifted in the emotional lives of children across this country. Parents feel it before they can name it. A dinner table that empties faster. A bedtime that keeps sliding. A group of teenagers sitting together, each one staring into a screen.
In my clinic, the children I see are getting younger.
A 2025 study in the Journal of the American Medical Association followed more than 4,000 children beginning at age 10 and found that what predicted harm from social media was not the amount of screen time alone. It was compulsive use: feeling unable to stop, using platforms to escape, experiencing distress when away from them. Children with that pattern of use were two to three times as likely to have thoughts of suicide or to harm themselves by age 14. The study cannot prove cause and effect, but the association is significant. The risk is not in the hours. It is in the design.
These platforms are built on intermittent reward. A like might come. It might not. The unpredictability is the point. For adults, that is annoying. For adolescents, whose sense of self is still being built from the outside in, it is something closer to urgent. That is not a design flaw. It is a design choice.
The courtroom in Los Angeles has now confirmed what families have known for years. And individual households have been left to shoulder the burden of managing a potentially harmful environment they had no part in designing.
When they are unable to manage it, that is not a parenting failure. It is a policy failure.
We have already seen what happens when children are given protected space from social media. Last year, provinces across Canada banned phones in classrooms. Teachers described what came back: noise in the hallways, conversation at lunch.
More than eight in ten Canadians supported those bans. The question that followed naturally was: why stop at the school door?
Australia acted, setting a minimum age of 16 for social media. Norway is moving toward similar legislation. Canada is still drafting. The federal government has been considering an Online Harms bill for years. Here, the verdict in Los Angeles is not a reason to start that conversation. It is a reason to finish it.
A focused approach that places responsibility on platforms, rather than on parents or children, is within reach.
Some will say that young people will find workarounds, that no age limit is perfectly enforced. Both are true. But we do not require perfection from other age-based protections. We set minimum ages for alcohol and driving because the norm itself matters. It shapes expectations and what industries are required to do.
Sixteen is not an arbitrary line. It reflects a period of heightened vulnerability to social comparison, impulsivity and the pull of approval. The platforms have known for years exactly who they are designing for.
It is also true that social media can be a lifeline for isolated teenagers. Even with a minimum age in place, once young people are on the platforms, the companies that build them must still be held accountable for how they are designed.
The parents in my clinic are managing something larger than any household can handle alone. They have been doing it for years, without much help. A jury has now confirmed what they’ve been struggling with.
The verdict tells us what these companies did. What comes next is a choice about what we require of them, and that decision belongs to legislators, not juries.