The wife of a victim of the Florida State University mass shooting has sued OpenAI, accusing the company of providing “input and assistance” to the alleged shooter. The suit was filed by Vandana Joshi, whose husband, Tiru Chabba, was one of two university employees killed in the April 2025 incident, which also left seven others injured.
According to the lawsuit, the alleged shooter, Phoenix Ikner, “provided input and information during conversations with ChatGPT over several months and particularly in the days leading up to the shooting.” Joshi’s lawyers accused ChatGPT of then assisting Ikner by identifying the guns used in the shooting, teaching him how to use firearms, and helping him prepare for the attack. According to chat logs between Ikner and ChatGPT cited in the lawsuit, the chatbot also suggested that involving children in a mass shooting would attract more attention and make national news. The lawsuit accuses OpenAI of negligence, battery and wrongful death, and demands a jury trial.
In response, OpenAI spokesperson Drew Pusateri told Engadget that the company is still cooperating with authorities and is continuously working to improve its safeguards. “In this case, ChatGPT provided factual answers to questions with information that could be found in widely public sources on the Internet, and it did not encourage or promote illegal or harmful activity,” Pusateri said.
“Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime,” Pusateri told Engadget in a statement. “After learning of the incident, we identified an account believed to be associated with the suspect and proactively shared this information with law enforcement.”
Florida Attorney General James Uthmeier also recently launched a criminal investigation into OpenAI, over the belief that its chatbot’s role in the FSU shooting made the company a principal to a crime under state law.