Sam Altman has issued an apology following OpenAI's decision not to report the ChatGPT user involved in the Tumbler Ridge school shooting.
**TL;DR** Sam Altman apologized to the community of Tumbler Ridge, British Columbia, after OpenAI failed to inform law enforcement about a ChatGPT user its systems had flagged, who later committed Canada's deadliest school shooting in decades, killing eight and injuring 27. In June 2025, about a dozen OpenAI employees reviewed the flagged account and some recommended contacting police, but leadership rejected the recommendation, saying the conversations did not meet the company's “higher threshold” for reporting. OpenAI has since lowered that threshold and opened a direct line of communication with the RCMP, but these changes are voluntary: Canada has no law requiring AI companies to report threats identified on their platforms.
Altman issued an open letter to the Tumbler Ridge community apologizing for the company's failure to alert law enforcement about a user it banned in June. "I am deeply sorry that we did not alert law enforcement to the account that was banned in June," he wrote, acknowledging the harm and irreversible loss the community had suffered. The letter, dated April 23 and released publicly the following day, came more than two months after Jesse Van Rootselaar, 18, carried out a shooting spree on February 10 that began at his family's home and ended at Tumbler Ridge Secondary School. OpenAI's automated system had flagged Van Rootselaar's ChatGPT account eight months earlier, in June 2025. About a dozen employees reviewed the flagged conversations, which indicated potential gun violence, and some recommended contacting police; leadership ultimately dismissed the recommendation. The account was banned, but no authorities were informed, and Van Rootselaar created a second account that went undetected until after the RCMP revealed his identity.
**The Decision**
The Wall Street Journal was the first to report on OpenAI's internal discussions. Employees who reviewed Van Rootselaar's flagged account saw signs of a potential risk of serious harm to others and advocated reporting the conversations to law enforcement. Leadership, however, applied what an OpenAI spokesperson described as a “higher threshold” for assessing credible threats and determined the activity did not qualify. The account was banned and the conversations were retained internally, but law enforcement was never notified. Months later, Van Rootselaar killed his mother and half-brother in their home before proceeding to the secondary school, where he shot and killed six people and injured 27 others. Among the injured was 12-year-old Maya Gebala, who was critically wounded while attempting to shield classmates. Van Rootselaar died by suicide at the school.
A civil lawsuit was filed in the BC Supreme Court in March by Cia Edmonds on behalf of her daughter Maya, alleging that ChatGPT provided information that assisted in planning a mass casualty event, including weapon types and examples from previous violent incidents. The specific content of the conversations remains undisclosed; BC Premier David Eby said he deliberately refrained from asking about the chat logs to avoid jeopardizing the RCMP investigation. What is clear is that OpenAI's system flagged the conversations as potentially dangerous and that staff recommended a course of action, which leadership overruled. The apology is therefore not about a failure to identify the threat; the detection worked. It addresses the failure to act on what was detected.
**The Letter**
Altman's letter to the Tumbler Ridge community was made public after Premier Eby revealed that Altman had agreed to apologize following discussions about OpenAI's response to the incident. “I have been thinking of you often over the past few months,” Altman wrote, saying he could not fathom the pain of losing a child, and committing to work with government officials to prevent similar tragedies. The letter, however, contained no specific policy commitments and no acknowledgment that employees had been overruled when they recommended contacting authorities. Eby called the apology necessary but grossly inadequate given the devastation faced by the families of Tumbler Ridge. Mayor Darryl Krakowka confirmed receipt of the letter and asked for consideration as the community processes its grief.
Specific policy commitments came separately, in a letter from OpenAI's vice-president of global policy, Ann O'Leary, to Canadian federal ministers. O'Leary said OpenAI has lowered its reporting threshold so that a user no longer needs to explicitly discuss “the target, means, and timing” of potential violence for a conversation to be reported. The company has also brought in mental health and behavioral experts to evaluate flagged cases and established a direct relationship with the RCMP. O'Leary indicated that had Van Rootselaar's conversations been reviewed under the new policy, they would have been reported to police. These changes remain discretionary, however, carry no binding legal force, and could be reversed at any point. Canada currently has no legislation compelling AI firms to report threats identified on their platforms.
**The Pattern**
Tumbler Ridge is not an isolated incident. In Florida, an investigation has been initiated
