
Tumbler Ridge, B.C., lawsuits filed in California court against OpenAI

Seven families impacted by the Tumbler Ridge shooting in February filed lawsuits against OpenAI and its founder, Sam Altman, in a San Francisco court on Wednesday.

Eight people were killed when 18-year-old Jesse Van Rootselaar opened fire at a school in Tumbler Ridge on Feb. 10.

A cross-border legal team is pursuing action against OpenAI and Altman. Seven lawsuits have been filed on behalf of five murder victims and two people who were injured.

“These families from the Canadian north have come together and they’ve decided to pursue litigation in the United States on a scale that can hold these companies to account,” Vancouver-based lawyer John Rice with Rice Parsons Leoni & Elliott LLP, told Global News.

The claims, which have not been tested in court, state that in the weeks that followed the attack in Tumbler Ridge, “a sickening truth emerged: ChatGPT played a role in the mass shooting and OpenAI could have, and should have, prevented it.”


In February, OpenAI confirmed that an account connected with Van Rootselaar was identified the previous June and was subsequently banned for violating the usage policy.

The company considered referring the account to law enforcement but determined the account activity did not meet the threshold for a referral because it did not involve an imminent and credible risk, or planning, of serious physical harm to others.

In a statement in February, RCMP Staff Sgt. Kris Clark said the platform did reach out to the RCMP after the shooting.

“Sam Altman and his leadership team knew what silence meant for the citizens of Tumbler Ridge,” the lawsuits state.

“They were focused on what disclosure meant for themselves. Warning the RCMP would set a precedent: OpenAI would be compelled to notify authorities every time its safety team identified a user planning real-world violence.


“Given the volume of chat-induced violence on ChatGPT, that would require a dedicated law-enforcement referral team tasked with reporting OpenAI’s own users to authorities. And the public would finally see what OpenAI was desperately trying to hide: that ChatGPT is not the safe, essential tool the company sells it as, but a product dangerous enough that its makers routinely identify its users as threats to human life.”


The suits make several claims of negligence, product liability and violation of California’s Business and Professions Code.

Chicago lawyer Jay Edelson of Edelson PC and Vancouver lawyer Rice met with families in Tumbler Ridge before filing the suits in California, where OpenAI is based.

“We spent the last two days meeting with the victims of the Tumbler Ridge shooting,” Edelson said.

“It has been some of the most difficult days of our professional lives.”

The lawsuits claim OpenAI’s safety team urged leadership to notify the RCMP but for OpenAI, “this was a question of corporate survival.”

According to Reuters, OpenAI is laying the groundwork for an initial public offering that could value it at up to $1 trillion.

Edelson added that they will be asking the jury to “send a strong message to OpenAI that it can’t make a decision to put profits over the lives of little kids, and it’s hard to imagine that we won’t ask for at least a billion dollars.”

The lawsuits likened OpenAI’s decision not to notify RCMP to Ford’s decision in the 1970s to keep selling the Pinto after its engineers warned that the fuel tank design would cause people to burn to death in rear-end collisions.

“This tragedy was not just predictable, it was preventable,” Rice said.

The suits claim “company leaders overruled the safety team members”, “deactivated the Shooter’s account, and kept what they had seen to themselves.”

“When the story eventually broke, Altman and OpenAI lied. First they claimed to have ‘banned’ the Shooter’s account.”

However, the lawsuits claim that OpenAI does not ban users. “It only ‘deactivates’ them – a process that can be reversed within minutes by registering a new account. The Shooter did exactly that, and continued using ChatGPT to plan the attack.”


“When we see the chats, I am very convinced that you’re going to see that ChatGPT wasn’t just listening to the shooter but actively pushing the shooter into this mindset,” Edelson said.

The lawsuits claim that OpenAI already had “clear knowledge” that people were using its product to plan and prepare real-world violence.

They cite a January 2025 case in which a man used ChatGPT for feedback on how to use explosives and evade surveillance before detonating a Tesla Cybertruck in front of the Trump International Hotel in Las Vegas.

They also cite an April 2025 case in which a 20-year-old gunman carried out a mass shooting at Florida State University. According to the lawsuits, chat logs showed that the gunman had used ChatGPT “extensively” in the lead-up to and during the attack, asking questions about how to fire a shotgun, the legal fates of school shooters and when the student union would be busiest.

The suits also reference a May 2025 incident in which a teenage boy in Finland used ChatGPT for nearly four months to help prepare for an attack in which he stabbed three 14-year-old girls at his school.

Finnish authorities reported that the boy had made hundreds of chatbot queries, including research into stabbing tactics, concealment of evidence and information on mass killings.

Some legal action was initially started in Canada but was set aside in favour of pursuing the litigation stateside.

“In terms of expressing society’s condemnation, deterring corporate malfeasance, a Canadian court can’t do that for a company of this size,” Rice explained.

On April 24, Altman issued an apology letter to Tumbler Ridge, saying he is “deeply sorry that we did not alert law enforcement to the account that was banned in June.

“While I know words can never be enough, I believe an apology is necessary to recognize the harm and (irreversible) loss your community has suffered.”

Twelve-year-old Maya Gebala was shot three times at close range in the library of Tumbler Ridge Secondary School.

She has been fighting for her life in the hospital ever since.


In response to that apology letter, Gebala’s mother, Cia, released a statement saying in part, “to think, a simple phone call could have prevented this.”

“Tumbler Ridge sees your ‘apology’, Sam. We do not accept it.”

All seven lawsuits request jury trials, which the legal team expects will move forward next year.
