
People gather at a makeshift memorial for the victims of a mass shooting in Tumbler Ridge, B.C., Feb. 12. Jennifer Gauthier/Reuters

Sam Altman, the chief executive officer of OpenAI, one of the most valuable privately held companies in the world, will deliver an apology to families in Tumbler Ridge after hearing about the impact of a deadly school shooting, B.C. Premier David Eby says.

Mr. Altman, Mr. Eby and Tumbler Ridge Mayor Darryl Krakowka spoke in a 30-minute video call on Thursday about OpenAI’s role in the mass shooting on Feb. 10. The perpetrator’s conversations on the company’s ChatGPT platform months earlier had raised red flags within the company, but were not reported to law enforcement.

Mr. Eby told reporters after the meeting that he asked for the apology because “OpenAI had the opportunity to notify authorities and potentially even to stop this tragedy from happening,” although he acknowledged there are other major issues including mental health supports and the shooter’s access to weapons in the home that are also under scrutiny.


The Premier did not ask Mr. Altman about the content of those chats, but he said the RCMP have assured him that they have issued preservation orders to all the social media and AI companies involved and that the preserved material has been incorporated into the investigation.

“I made the very specific decision not to ask about the content of the chats with Mr. Altman. I don’t want to play any role in interfering with the criminal investigation that’s under way,” Mr. Eby said. “I want the police to release information as they feel that it’s appropriate.”

Mr. Eby demanded the meeting after refusing to meet with lower-tier executives from the company. He also asked OpenAI to support his call for federal regulatory standards that would establish a “duty to report” minimum threshold across the country for all AI companies.

Mr. Eby said the company agreed to make recommendations and to provide advocacy around federal regulatory standards. “I don’t believe that OpenAI’s current standard is sufficient where there is an option to report,” he said. As well, he said, there need to be consistent standards for all companies that provide similar chatbot services through artificial intelligence.

“It’s not acceptable that it’s up to the companies about whether or not to report, and that needs to change.”

On Wednesday, federal AI Minister Evan Solomon met with Mr. Altman to set out Ottawa’s demands, including the need to have Canadian experts assess ChatGPT conversations that have been flagged for signs that users intend to cause imminent harm to determine whether to alert law enforcement.


How artificial-intelligence companies interact with law enforcement has become a major concern after revelations that California-based OpenAI did not tell Canadian authorities about troubling conversations that 18-year-old Jesse Van Rootselaar had with ChatGPT months before fatally shooting eight people in Tumbler Ridge, B.C., and then killing herself. Six of the victims were under 14 years old. The shooter’s account was closed down for a violation of ChatGPT’s usage policy, but the company later said the content did not reveal “credible and imminent planning” of violence according to policies that were in place at the time.

The company said it has already changed its policies to better identify potential warning signals of serious violence. A spokesman for OpenAI was unavailable Thursday for comment about the planned apology.

Mr. Solomon said Wednesday that he told Mr. Altman that Canadian experts in mental health, law and privacy have to weigh in on such sensitive matters. Mr. Solomon did not say whether the government will introduce regulations for when AI companies report to law enforcement, which some experts have recommended.

Canada does not have overarching AI legislation, nor does it have a set of rules that apply specifically to chatbots, unlike some other jurisdictions. Some experts have said that forthcoming online harms legislation should cover chatbots as well as social-media platforms.
