A makeshift memorial at the steps of the town hall in Tumbler Ridge, B.C. Jennifer Gauthier/Reuters
Families of Tumbler Ridge shooting victims pursuing lawsuits against artificial-intelligence companies could face a long road ahead, according to a lawyer who has represented hundreds of clients in similar cases.
Matthew Bergman, a lawyer with the Seattle-based Social Media Victims Law Center, predicted the proceedings would likely play out over years, though not decades.
“We are in the very infant stage of this litigation,” Mr. Bergman said.
“We’re definitely not talking months.”
On Monday, the family of Maya Gebala, 12, one of the surviving victims of last month’s school shooting in Tumbler Ridge, B.C., filed a lawsuit against tech company OpenAI in British Columbia’s Supreme Court, claiming its ChatGPT chatbot helped incite one of the worst mass shootings in Canadian history.
The civil claim draws on media reports and statements of public officials and OpenAI representatives to argue that the company had specific knowledge of the shooter’s violent intentions but did not warn relevant law-enforcement agencies.
“ChatGPT equipped the Shooter with information, guidance, and assistance to plan a mass casualty event like the Tumbler Ridge Mass Shooting including … the types of weapons to be used, and describing precedents from other mass casualty events or historical acts of violence,” the civil claim says.
OpenAI has faced widespread criticism since it was revealed that, months before the Feb. 10 shooting, the company had flagged, but not reported to authorities, worrying interactions between the shooter and ChatGPT.
In a statement, OpenAI did not directly address the lawsuit.
“What happened in Tumbler Ridge was an unspeakable tragedy, and our thoughts remain with the victims, their families, and the entire community. OpenAI remains committed to working with government and law enforcement officials to make meaningful changes that help prevent tragedies like this in the future.”
Suing AI and social-media companies is the sole focus of Mr. Bergman’s firm, he said. The firm targets companies over alleged harms such as social-media addiction and what he calls chatbot-induced psychosis.
He said the firm has filed more than 1,500 lawsuits with clients across the United States, Britain and Australia. About 5 per cent of the cases involve Canadian plaintiffs, he said, though those cases have all been filed in California courts because that’s where OpenAI is headquartered.
The Tumbler Ridge lawsuit is the first case against OpenAI he has heard of being filed in a Canadian court, Mr. Bergman said, adding that there is no reason to think such cases would be less successful than the suits he is pursuing in California.
“There are different ways to go about this,” he said.
These types of wrongful-death or wrongful-harm lawsuits against AI platforms are still very new. Of the hundreds of cases Mr. Bergman’s firm is handling, the oldest is that of 23-year-old Zane Shamblin, which was filed in the fall of 2025. In that case, Mr. Bergman’s clients allege that Mr. Shamblin engaged in a four-hour “death chat” with ChatGPT that included explicit discussions about self-harm and encouragement from the chatbot, and ended with Mr. Shamblin’s suicide.
Jay Edelson is a lawyer representing several clients in wrongful-death cases against AI platforms. He said ChatGPT is not the only platform alleged to have pushed a person in mental distress into a state of dangerous psychosis.
His firm filed suit against another AI platform over what Mr. Edelson called a “potential mass casualty” incident, in which a chatbot allegedly convinced a Florida man to attempt to steal a vehicle from an airport and to kill any witnesses.
“He showed up at the Miami-Dade airport with knives and tactical gear,” Mr. Edelson said.
The Tumbler Ridge lawsuit accuses OpenAI of rushing its GPT-4o model into public use without necessary safety testing. Many of Mr. Bergman’s and Mr. Edelson’s cases also focus on that model, which Mr. Bergman alleges was defective.
The lawsuit filed in B.C. alleges that GPT-4o tends to echo problematic language back to users, sometimes encourages harmful behaviour, and often behaves in ways that mimic a clinical therapist without any of the necessary safeguards in place.
In the summer of 2025, OpenAI released a new version, GPT-5, but GPT-4o remained available until only recently, Mr. Bergman said.
Ultimately, Mr. Bergman said, he expects these cases to be long and complex, and while the outcomes are uncertain, he believes they are worth pursuing.
“These cases are very cutting-edge, and there’s no guarantee that these cases are going to be financially remunerative,” he said. “If you’re concerned about justice and accountability, then I would encourage people to do it.”
With reports from Andrea Woo