
A police vehicle parked outside a high school, the site of a deadly mass shooting in the town of Tumbler Ridge, B.C., on Feb. 11. Jennifer Gauthier/Reuters

A representative of tech giant OpenAI met with the B.C. government one day after an 18-year-old killed six people in a school shooting in Tumbler Ridge, but the company did not disclose that it had suspended the shooter’s ChatGPT account months earlier because of concerning content.

The meeting had been planned before the incident, the province said Saturday. The day after the meeting – two days after the shooting – representatives with OpenAI asked their provincial contact for help with connecting with the RCMP, Premier David Eby’s office said.

The shooter killed five students and a teacher’s aide at the school, and, beforehand, killed her mother and half-brother at their nearby home. The shooter, whom RCMP have identified as Jesse Van Rootselaar, then killed herself at the school as police responded to the scene.

The Wall Street Journal reported Friday that OpenAI employees wanted the company to alert police in June over the shooter’s posts involving gun violence but they were rebuffed.


OpenAI’s decision to shut down the shooter’s account, but not to flag the concerns to law enforcement, is deeply alarming, Mr. Eby and federal AI Minister Evan Solomon said in separate statements. The news comes as governments around the world struggle with how to regulate the powerful, fast-developing technology.

“Reports that allege OpenAI had related intelligence before the shootings in Tumbler Ridge took place are profoundly disturbing for the victims’ families and all British Columbians,” Mr. Eby said in the statement.

“The pain that these families have gone through is unimaginable.”

Mr. Eby said police are working to preserve any potential evidence related to the shootings and held by digital services companies, including social media platforms and AI companies.

“We will use all powers of government to ensure that police have the tools they need to investigate every aspect of this horrific tragedy,” he said.


Flowers are laid in Tumbler Ridge the day after the mass shooting. Jesse Winter/The Globe and Mail

Mr. Solomon said in a statement he’s also “deeply disturbed by reports that concerning online activity from the suspect was not reported to law enforcement in a timely manner.”

The minister said he is in contact with OpenAI and other companies about safety procedures.

“All options are on the table to ensure public safety and the protection of our children.”

OpenAI confirmed Friday that the shooter had been banned from using its ChatGPT chatbot last June after her posts were flagged by OpenAI’s automatic screening systems.

The company said in its Friday statement that it did not notify law enforcement last June because the company did not identify “credible or imminent planning.”

For posts to trigger a referral to law enforcement, they must indicate “an imminent and credible risk of serious physical harm to others,” the company said.


The mass shooting occurred Tuesday, Feb. 10.

On Feb. 11, a B.C. government representative met with staff from the San Francisco tech giant to discuss the company’s interest in opening a satellite office in Canada.

On Feb. 12, OpenAI requested contact information for the RCMP.

OpenAI did not inform any member of government that it had potential evidence regarding the shootings in Tumbler Ridge, the B.C. government said.

In a statement to The Globe and Mail Saturday, OpenAI said as soon as it became aware of the shooter’s identity in media reports, it reached out to the U.S. Federal Bureau of Investigation to pass along information to the RCMP.

The company said that is how it has handled such cross-border law enforcement communications about its users in the past.

On Feb. 12, the day after its meeting with the B.C. government, OpenAI realized a more direct high-level contact with the RCMP would be needed, the company statement said.

OpenAI is undertaking a review of this case to determine whether its processes could be improved, the statement said.


The federal government has backed away from legislation specifically focused on AI, but plans to introduce bills focusing on privacy and online harms.

Taylor Owen, an associate professor at McGill University and a member of the federal task force advising Ottawa on its upcoming AI strategy, has said that online-harms legislation should address AI platforms.

“AI systems pose significant risks,” he wrote in his submission to government last fall.

He noted that studies have shown chatbots “fail to respond appropriately to users experiencing mental health crises, reinforce cognitive distortions through mirroring language, and cultivate a false sense of emotional reciprocity.”

A U.S. lawyer representing two families who are suing OpenAI said this is not the first time the company has failed to alert authorities when users who discussed violence and self-harm with ChatGPT later carried out that violence in the real world.

Jay Edelson represents the family of Adam Raine, a 16-year-old who died by suicide in 2025. The family alleges OpenAI’s chatbot coached Raine to kill himself.

Mr. Edelson also represents the estate of Suzanne Adams, who was murdered by her adult son in 2025. In the lawsuit, the family alleges ChatGPT convinced him that his mother was part of a conspiracy to kill him.

The allegations have not been proven in court.

In an interview with The Globe and Mail, Mr. Edelson said OpenAI’s decision not to disclose what it knew earlier about the Tumbler Ridge shooter is alarming.

“We are very convinced that this is a widespread problem,” he said.

“How many other people out there right now are speaking to ChatGPT ... about potentially planning mass casualty events. And if the answer is anything other than zero, that’s a problem for them [the company].”

OpenAI said in its statement that ChatGPT is trained to discourage users who express an intent to harm others and “to avoid providing advice that could result in immediate physical harm to an individual.”

The company’s statement said overreporting to law enforcement risks causing “distress” to a young person and family if officers show up unannounced. It can also “introduce unintended harm” and raise privacy concerns, the statement said.

Candice Alder, a B.C.-based psychotherapist and AI ethics consultant with Synthetica.io, agrees.

She cautioned against relying on AI platforms to become “informal extensions of law enforcement,” saying that doing so risks compromising important Charter-protected rights such as privacy and free expression.

“If we lower the reporting threshold for AI platforms to include speech that is merely concerning, we risk normalizing a form of privatized behavioural surveillance,” Ms. Alder said in an e-mailed statement to The Globe.

She added that AI platforms are not a replacement for professional mental health services and are not equipped to do clinical risk assessments.

Ms. Alder also noted that in the Tumbler Ridge case, police and mental health professionals were already involved with the shooter. The shooter had been in psychiatric treatment, and weapons had been seized from the home.

“We should be cautious about retroactively shifting responsibility onto an AI platform when established legal and law enforcement mechanisms were already in motion,” Ms. Alder wrote.

The RCMP confirmed Friday that OpenAI contacted its investigators after the shooting.

RCMP Staff Sergeant Kris Clark said a “thorough review of the content on electronic devices, as well as social media and online activities” of the shooter is taking place.

He said “digital and physical evidence is being collected, prioritized, and methodically processed.”

The RCMP said Saturday they are also investigating threats that have circulated online and within the community of Tumbler Ridge.

Police say a safety plan is in place for those who have been affected as well as the community as the investigation continues.

Police did not offer specifics on the threats, but say officers have connected with the mayor and community leaders to ensure continuing communication and public safety planning.

With a report from The Canadian Press
