Nearly a year after the deadly shooting at Florida State University, in which two people were killed and several others injured, the family of one of the victims is preparing to sue OpenAI, the company behind ChatGPT, alleging the AI chatbot may have played a role in enabling the attack.

Lawyers representing the family of Robert Morales claim the accused gunman was in “constant communication” with ChatGPT in the lead-up to the April 17, 2025 incident. The legal team has argued that the chatbot may have provided guidance that contributed to the planning and execution of the shooting, calling it a factor in what they described as a “senseless and heinous crime.”

Robert Morales, 57, a university dining programme manager and former high school football coach, was one of the two people killed in the attack. Tiru Chabba, a 45-year-old father, also lost his life, and six others were injured. Morales’ family remembered him as “a man of quiet brilliance and many gifts.”
Details from chat records raise concerns
Court filings reveal that more than 270 ChatGPT interactions have been listed as evidence. While not all of the messages are public, available records suggest the accused shooter asked questions ranging from personal distress and self-worth to firearms usage and patterns in mass shootings.

In one exchange, the chatbot reportedly provided factual information about when the university’s student union is busiest, a time window that coincided with the attack. In another, it allegedly explained how to operate a shotgun shortly before the shooting began. Investigators said the suspect also searched for information about prison systems and the fate of mass shooters in the hours leading up to the incident.

The lawsuit is expected to argue that OpenAI failed to prevent harmful interactions despite warning signs. The family’s lawyers have indicated they will seek accountability not only from the company but potentially from other institutions as well, including local law enforcement agencies that had prior contact with the suspect.

The accused shooter, a student at the time, faces charges including first-degree murder and attempted murder. His trial is currently scheduled for October, though the timeline may shift.

In recent months, multiple lawsuits have alleged that chatbots encouraged self-harm, fueled delusions, or failed to alert authorities to dangerous user behaviour.
OpenAI responds
In response, OpenAI said it identified an account linked to the suspect after the incident and shared relevant information with law enforcement. The company also said that ChatGPT is designed to detect intent and respond safely, and that safeguards are continually being improved.

“Our hearts go out to everyone affected by this devastating tragedy,” the company said in a statement.