The family of a boy seriously injured in one of Canada’s worst mass shootings is suing OpenAI, arguing that the technology company could have prevented the attack on a school last month.
The lawsuit comes days after the head of OpenAI said he would apologize to families in a remote Canadian town after violence tore apart the tight-knit community.
Eight people, including five students between the ages of 12 and 13 and a 39-year-old teaching assistant, were killed by an 18-year-old shooter in the mountain town of Tumbler Ridge on February 10.
It was later learned that the shooter, Jesse Van Rootselaar, who died from a self-inflicted injury, had described violent gun-related scenarios to ChatGPT over several days in June, which an automated review system flagged, according to the Wall Street Journal.
But OpenAI, which owns the chatbot, said the account activity did not indicate “credible or imminent planning,” so it banned Van Rootselaar’s account but did not notify Canadian authorities. The company later said it found a second account linked to the shooter after the first was suspended.
On Monday, Cia Edmonds filed a lawsuit against the company on behalf of herself and her two daughters, Maya and Dahlia Gebala, who were present during the shooting.
“The purpose of this lawsuit is to learn the full truth about how and why the Tumbler Ridge mass shooting occurred, impose liability, seek redress for damages and losses, and help prevent another mass shooting atrocity in Canada,” law firm Rice Parsons Leoni & Elliott LLP, which represents the family, said in a statement.
The allegations have not been proven in court.
Maya, 12, was shot three times. One bullet entered her head above her left eye and another hit her neck. A third bullet grazed her cheek and part of her ear, the lawsuit says.
She remains in the hospital after suffering a catastrophic traumatic brain injury, permanent cognitive and physical disability, right-sided hemiplegia, scarring and physical deformities, according to the claim.
Both Edmonds and her daughter Dahlia, who was not physically injured in the shooting, have experienced post-traumatic stress disorder, anxiety, depression and sleep disorders.
Edmonds’ civil suit alleges that OpenAI released ChatGPT to the market without adequate safety testing. The family is seeking undisclosed punitive damages, saying the company’s conduct “is reprehensible and morally repugnant” to both the plaintiffs and the “community at large.”
Last week, OpenAI CEO Sam Altman met virtually with British Columbia Premier David Eby and Tumbler Ridge Mayor Darryl Krakowka amid growing frustration that existing policies within the tech giant did not require it to report violent content to police.
“Everyone on the call acknowledged that an apology is not enough, but also that it is completely necessary,” Eby said. “And the mayor of Tumbler Ridge is going to work with OpenAI to make sure that any public statements related to this are made appropriately and meaningfully, to the extent possible, (and) do not re-traumatize people in the community.”
When asked to comment on the lawsuit, a company spokesperson called the shooting an “unspeakable tragedy” and said Altman will work with Eby and Krakowka “to find the best way to convey his apology and support to the Tumbler Ridge community,” but did not give a timeline.
“OpenAI remains committed to working with provincial and local officials to make meaningful changes that will help prevent tragedies like this in the future.”
The company did not say whether the lawsuit would change Altman’s plans to apologize.
“OpenAI had the opportunity to notify authorities and potentially even prevent this tragedy from happening,” Eby told reporters after the meeting with Altman. While the premier said the company could have done more, he also pointed to a lack of mental health support and the shooter’s access to firearms.
Eby, who gave an emotional speech to the community at a vigil in the days after the shooting, has become a staunch critic both of the largely non-existent regulatory framework governing how AI companies operate in Canada and of how OpenAI handled the situation.
“It is not acceptable that it is up to companies to report or not, and that must change.”
Eby refused to meet with members of the company’s management team and instead demanded to speak directly with Altman. In the 30-minute call, the premier said, he did not ask about interactions between the shooter and the OpenAI chatbot.
Under pressure from lawmakers, the company has already changed the way it works to better identify potential warning signs of serious violence. Canada’s AI Minister Evan Solomon said he had asked the company to apply its new safety standards retroactively and review previously flagged cases.
“This will determine whether additional incidents that would have been referred to authorities under OpenAI’s new safety standards were missed, and ensure they are promptly reported to the RCMP.”
While Eby said OpenAI’s leadership has been “responsive” to governments’ concerns, he cautioned that other companies with similar chatbots had not yet changed their policies.
“The status quo doesn’t work, it didn’t work, and it presents a huge threat that it could fail again,” Eby said. “And that’s why changes need to be made quite urgently.”