The parents of a girl critically wounded in a school shooting in Canada alleged in a civil lawsuit on Monday that ChatGPT-maker OpenAI knew the shooter was planning a mass attack.
OpenAI has said it considered alerting police about the activities of the person who months later committed one of Canada's worst school shootings, in Tumbler Ridge, British Columbia, on February 10, but did not do so.
OpenAI came forward to police after Jesse Van Roostselaar killed eight people and then herself last month, saying the attacker's ChatGPT account had been closed but that she evaded the ban by having a second account.
The legal claim filed in the British Columbia Supreme Court alleged that OpenAI had "specific knowledge of the shooter utilising ChatGPT to plan a mass casualty event like the Tumbler Ridge mass shooting".
The lawsuit said the shooter used OpenAI's chatbot ChatGPT as a trusted confidante, collaborator and ally, and that the chatbot willingly assists users such as the shooter in planning mass casualty events.
An OpenAI spokeswoman did not immediately respond to a message seeking comment on the lawsuit.
The lawsuit said that as a result of the company's conduct, Maya Gebala was shot three times at close range, with one bullet hitting her head, another her neck and the third grazing her cheek. It said she has a catastrophic brain injury that will leave her with permanent cognitive and physical disabilities.