
People stand next to a makeshift memorial for the victims of the mass shooting in Tumbler Ridge, B.C., on Feb. 12. Jennifer Gauthier/Reuters

OpenAI, whose AI chat platform hosted alarming interactions with the Tumbler Ridge shooter last summer, says it has made changes that would have flagged the content to law enforcement, and is promising that it will take further steps to satisfy concerns raised by the federal government this week.

In a letter released Thursday to Artificial Intelligence Minister Evan Solomon, the company promised immediate steps to “help prevent tragedies like this in the future.”

Mr. Solomon met with OpenAI officials this week after learning that the company had shut down the shooter’s account in June, 2025, over posts about gun violence but had not contacted the RCMP. It did so only after the 18-year-old killed eight people, including six at the local secondary school, before turning the gun on herself on Feb. 10.

The company started making changes several months ago that now mean the shooter’s exchanges on ChatGPT would be flagged to law enforcement, said Ann O’Leary, the company’s vice-president of global policy.


“Mental health and behavioural experts now help us assess difficult cases, and we have made our referral criteria more flexible to account for the fact that a user may not discuss the target, means, and timing of planned violence in a ChatGPT conversation but that there may be potential risk of imminent violence. With the benefit of our continued learnings, under our enhanced law enforcement referral protocol, we would refer the account banned in June 2025 to law enforcement if it were discovered today,” she wrote.

Ms. O’Leary said the company is willing to work with the Canadian government and experts to continue strengthening its law-enforcement referral criteria. “This will include continuing to analyze how imminent and credible risk is assessed, and transparency regarding our reporting to law enforcement,” she said.

Ms. O’Leary also revealed that the shooter had opened a second account after being banned for violating the company’s policy against violent activities.

“After the name of the Tumbler Ridge perpetrator was released publicly, we discovered that the perpetrator had used a second ChatGPT account. We shared the second account with law enforcement upon its discovery.”

A press secretary for Mr. Solomon’s office, Sofia Ouslis, said the government is looking at the company’s plan. “We are reviewing OpenAI’s letter carefully and will have more to say in the coming days,” she said in a written statement.


B.C. Premier David Eby said the promises to better self-regulate do not go far enough.

“They tragically missed the mark in bringing this information forward. The consequences of that will be borne by the people of Tumbler Ridge, the families of Tumbler Ridge, for the rest of their lives,” Mr. Eby told reporters in Victoria.

“It illustrates why these companies cannot be trusted to set their own reporting thresholds, and especially to set their own thresholds where there are no apparent consequences for not meeting them.”

He is calling on the federal government to set national standards that would ensure a minimum threshold of reporting “to make sure that the protection of the community, the protection of children, comes before the interests of shareholders.”

Officials from the Premier’s office met with OpenAI executives on Thursday, but Mr. Eby did not attend, insisting instead on a direct meeting with OpenAI chief executive Sam Altman, who has since agreed. Mr. Eby said he wants to “express our deep disappointment and concern about their conduct in relation to this horrific attack in Tumbler Ridge.”

Before the revelations about the shooter’s ChatGPT exchanges became public, concerns had been raised about whether police and mental-health services missed opportunities to intervene before the shooting. The teenage shooter had struggled with her mental health and had been taken away for psychiatric treatment before being returned to a Tumbler Ridge home. That home contained guns, which police had earlier seized but then returned.


Mr. Eby has promised a public inquiry or coroner’s inquest, once the police investigation is completed.

Helen Hayes, associate director at the Centre for Media, Technology and Democracy, said the disclosures in OpenAI’s letter point to systemic failure, not an isolated error.

“The letter acknowledges it banned the Tumbler Ridge perpetrator’s account in June 2025 and explicitly states that under its old criteria it didn’t refer the matter to law enforcement. Now it tells us that under its new criteria, it would have. That’s a devastating admission dressed up as progress,” she said.

“After the perpetrator’s name was made public, the company discovered she had created a second ChatGPT account that had actually slipped through their existing detection systems entirely. OpenAI frames this as a transparency measure, but I think it’s actually proof that their safeguards failed twice.”

Katrina Ingram, founder of Ethically Aligned AI, a consultancy in Edmonton, said the company’s commitments are vague.

“The letter stops short of sharing any specifics about what its actual policies are and how they are applied,” she said.

While the company has offered some insight into its protocols, she noted that the process was amended before the Tumbler Ridge shootings.

“There is some interesting insight into the process regarding mental health and behavioural experts helping assess difficult cases,” she noted. “It’s unclear how these experts are helping – is it part of a human review process or have they worked with experts on developing better automated triaging?”
