Sam Altman has issued an apology following OpenAI's decision not to report a ChatGPT user involved in the Tumbler Ridge school shooting.

**TL;DR** Sam Altman expressed his regret to the Tumbler Ridge community in British Columbia for OpenAI's failure to notify law enforcement after its systems flagged a ChatGPT user, who subsequently killed eight people and injured 27 in Canada's worst school shooting since 1989. In June 2025, about a dozen OpenAI employees reviewed the flagged account and some suggested reporting it to the police, but management overruled them, citing a “higher threshold” that the conversations did not satisfy. Since then, OpenAI has reduced its reporting threshold and reached out to the RCMP. However, these changes are voluntary, and Canadian law does not require AI companies to report identified threats.

Altman issued an open letter to Tumbler Ridge on Thursday, apologising for OpenAI's failure to inform law enforcement after its systems flagged a user involved in Canada's deadliest school shooting in nearly 40 years. “I am deeply sorry that we did not notify law enforcement about the account that was banned in June,” Altman wrote. “Though I understand words cannot replace what has been lost, I feel an apology is essential to acknowledge the pain your community has endured.” The letter, dated April 23 and made public a day later, was sent 72 days after Jesse Van Rootselaar, 18, fatally shot eight people and injured another 27 at Tumbler Ridge Secondary School on February 10. OpenAI's automated abuse detection system had flagged Van Rootselaar's ChatGPT account eight months prior, in June 2025. Around twelve employees reviewed the flagged conversations discussing gun violence scenarios, with some recommending alerting Canadian authorities, but leadership decided otherwise. The account was banned, but law enforcement was never informed. Van Rootselaar then created a second account and was identified only after the RCMP released his name.

**The Decision**

The Wall Street Journal reported details of the internal deliberations at OpenAI. Employees who evaluated Van Rootselaar's flagged account noted indications of "an imminent risk of serious harm to others" and proposed reporting the conversations to law enforcement. Leadership, however, applied what an OpenAI spokesperson later described as a “higher threshold” for credible and imminent threat reporting and determined the conversations did not meet this criterion. Consequently, the account was terminated, the conversations preserved internally, and the police were not contacted. Eight months later, Van Rootselaar murdered his mother, Jennifer Strang, 39, and his 11-year-old half-brother, Emmett Jacobs, at their family home before proceeding to the secondary school, where he opened fire with a modified rifle, killing education assistant Shannda Aviugana-Durand, 39, and five students aged 12 and 13. Twenty-seven people were injured, including 12-year-old Maya Gebala, who was shot three times while protecting her classmates and suffered severe brain injuries. Van Rootselaar later died by suicide at the school.

A civil lawsuit filed in March in the BC Supreme Court by Cia Edmonds on behalf of her daughter Maya claims that ChatGPT provided “information, guidance, and assistance to plan a mass casualty event, including the types of weapons to be used.” The details of the conversations have not been disclosed. BC Premier David Eby stated he deliberately refrained from asking about the chat logs to avoid hindering the RCMP investigation. What is clear is that OpenAI's system flagged the conversations as potentially dangerous, that employees recommended action, and that leadership chose not to take it. The apology reflects a failure not of detection, which worked, but of what happened after the threat was detected.

**The Letter**

Altman's letter addressed the Tumbler Ridge community and was released after Premier Eby revealed that Altman had agreed to issue an apology following discussions about OpenAI's handling of the situation. “I have thought of you often in the past few months,” Altman wrote. “I cannot imagine anything worse than losing a child.” He reiterated his commitment to work with local authorities to prevent similar tragedies in the future, although the letter did not include specific policy changes or acknowledge the employee recommendations that leadership overruled. Premier Eby called the apology “necessary” but “grossly insufficient” for the devastation experienced by families in Tumbler Ridge. Mayor Darryl Krakowka acknowledged the letter and requested “care and consideration” as the community grieves.

Policy commitments came later in a letter from OpenAI Vice President Ann O'Leary to Canadian federal ministers, stating that OpenAI had lowered its reporting threshold so that conversations no longer need to detail “the target, means, and timing” of intended violence for law enforcement referral. The company has also engaged mental health and behavioral experts to assess flagged cases and established direct communication with the RCMP. O'Leary indicated that under the revised policies, Van Rootselaar's conversations “would have been referred to police” had they occurred today.

**Other articles**

- Beijing issues a warning to the EU after 27 Chinese companies are included in the 20th sanctions package against Russia, and retaliates against European defense firms. China criticized the European Union's package, which affects 27 Chinese entities, and responded within a day by targeting seven EU defense companies. Europe's rearmament relies on the rare earth materials that Beijing possesses.
- Foundation Future Industries secures $24 million in Pentagon contracts for humanoid robot soldiers, supported by Eric Trump and trialed in Ukraine. The company's CEO previously led a bankrupt fintech, one of its advisers is Eric Trump, and it is seeking $500 million at a valuation exceeding $3 billion.
- Sequoia distributes 200 engraved Mac Minis at an AI event as OpenClaw emerges as the infrastructure layer that VCs cannot possess. Alfred Lin handed out 200 engraved Mac Minis that run OpenClaw, the open-source AI agent framework that has outperformed Apple's stock and surpassed React on GitHub. Sequoia is unable to invest in it. That's the approach.
- China establishes protections for gig workers, impacting 200 million platform employees, with an emphasis on algorithm transparency and a deadline set for 2027. China's top governing authorities have mandated minimum wages, maximum working hours, and algorithm transparency for over 200 million gig workers. Applications are prohibited from dispatching orders once drivers reach their time limits. Compliance is expected by 2027.

Eight months before the attack, OpenAI flagged the Tumbler Ridge shooter's ChatGPT account as a concern. Staff recommended notifying law enforcement, but management dismissed their recommendations. Altman's apology cannot bridge that gap.