OpenAI sued over Canada school shooting
The family of a girl critically wounded in a Canada school shooting has filed a lawsuit against ChatGPT-maker OpenAI, accusing the artificial intelligence firm of being aware of the suspect’s planned shooting but failing to alert authorities.
The suit was filed Monday in the Supreme Court of British Columbia by the parents of 12-year-old Maya Gebala, who was one of the more than two dozen individuals injured in a shooting at the Tumbler Ridge Secondary School on Feb. 10. Eight people, including a teacher and five students, were killed.
Gebala’s lawyers pointed to the two ChatGPT accounts allegedly used by 18-year-old suspect Jesse Van Rootselaar, who they say described various situations involving gun violence to ChatGPT over the course of several days. ChatGPT’s monitoring system flagged the posts, which were sent to employees for review, per the suit.
The suit stated that about a dozen OpenAI employees identified the posts as posing an “imminent risk of serious harm to others” and recommended that law enforcement be contacted. When the recommendations reached OpenAI leadership, the defendants allegedly “rebuffed their employees’ request” to contact law enforcement and only banned the first account.
The suspect allegedly opened a second ChatGPT account to continue planning gun violence scenarios, including a mass casualty attack.
“OpenAI harvested such harmful information and data in an indiscriminate manner and then supplied such information and data to ChatGPT,” the suit stated. “OpenAI took no steps — adequate or at all — to avoid providing ChatGPT with such information and data, or impose any safeguards to prevent users from obtaining such information from ChatGPT.”
Gebala’s family said she was shot three times, leaving her with a traumatic brain injury, permanent cognitive and physical disability, and other mental and physical injuries.
The suit went into further detail on ChatGPT’s development, including the GPT-4o model, which incorporated a memory tool that tailors responses to a user over time, and a sycophancy issue the company acknowledged — when rolling back an update — had made the chatbot’s responses overly supportive.
Lawyers alleged OpenAI’s features were “intentionally designed to foster dependency” between the user and ChatGPT, which they said “assumed the role” of a mental health counselor or pseudo-therapist.
Gebala’s family is seeking financial damages for OpenAI’s alleged negligence and failure to warn.
OpenAI did not immediately respond to a request for comment.
The lawsuit is the latest in a series of legal challenges for OpenAI, as courts weigh whether AI developers and their systems are responsible for certain mental health episodes.
Copyright 2026 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.