OpenAI is being sued by the child's family over the Canadian school shooting. (Image: AFP/Getty Images)
The family of a girl critically injured during a mass shooting at a Canadian school is suing ChatGPT-maker OpenAI, claiming it had been aware the suspect had been planning an attack but failed to alert the authorities.
Maya Gebala, 12, was shot in the neck and head on February 10 in Tumbler Ridge and remains in hospital. OpenAI banned an initial ChatGPT account associated with the suspect, 18-year-old Jesse Van Rootselaar, in June 2025 because of the nature of her conversations with the chatbot, but Canadian police were not notified. OpenAI told the BBC that the company was committed to making "meaningful changes" to help prevent similar tragedies in future.
Eight people were killed in the attack, including five young children and the suspect's mother, in one of the deadliest shootings in Canadian history.
The civil lawsuit, brought by Gebala's mother Cia Edmonds, alleges Rootselaar set up an account with ChatGPT before she turned 18 - something users can do with parental consent.
The plaintiffs allege no age verification took place on the site.
According to the lawsuit, the suspect described "various scenarios involving gun violence" to the chatbot over several days in late spring or early summer 2025 and viewed it as a "trusted confidante." The lawsuit claims that twelve OpenAI employees flagged the posts as "indicating an imminent risk of serious harm to others" and suggested that Canadian law enforcement be informed. Instead, it is alleged, the request to contact the authorities was "rebuffed" and the only action taken was to ban Van Rootselaar's account.
OpenAI has previously said it did not alert police because the account did not meet its threshold of a credible or imminent plan for serious physical harm to others.
Despite the earlier warnings from OpenAI's systems, the suspect was able to create a second ChatGPT account and "continue planning scenarios involving gun violence." According to the lawsuit, the company "took no steps to act upon this knowledge" despite "specific knowledge of the shooter's long-range planning of a mass casualty event." The plaintiffs state that, as a result of the company's conduct, Gebala, who was shot at three times after trying to lock a library door to keep out the shooter, has suffered a "catastrophic brain injury".