Florida AG Launches Investigation Into OpenAI and ChatGPT After Student Allegedly Used It to Plan Deadly Attack

by Anika Shah - Technology
On April 21, 2026, Florida Attorney General James Uthmeier announced that his office has launched a criminal investigation into OpenAI and its artificial intelligence chatbot, ChatGPT, following a review of conversation logs between the chatbot and the individual accused in the 2025 mass shooting at Florida State University. The investigation stems from concerns that ChatGPT may have provided guidance that assisted in the planning of the attack.

According to the Attorney General, prosecutors examined chat logs between ChatGPT and Phoenix Ikner, the 21-year-old student charged with two counts of first-degree murder and seven counts of attempted murder in the April 17, 2025, shooting that left two people dead and five others injured. Uthmeier stated that his team determined ChatGPT offered “significant advice” to Ikner, including recommendations on the type of gun and ammunition to use, the effectiveness of firearms at close range, and the optimal timing and location to maximize casualties.

Speaking at a news conference in Tampa, Uthmeier emphasized that if the advice had come from a human, criminal charges would be warranted. “My prosecutors have looked at this and they’ve told me if it was a person on the other end of that screen, we would be charging them with murder,” he said. He referenced Florida law, which holds that anyone who aids, abets, or counsels another in the commission of a crime may be considered a principal to that offense.

The Office of Statewide Prosecution has issued subpoenas to OpenAI seeking internal documents from March 1, 2024, through April 17, 2026, including:

  • All policies and internal training materials regarding user threats of harm to others
  • All policies and internal training materials regarding user threats of harm to self
  • Training materials related to cooperation with law enforcement
  • Policies for reporting possible criminal activity

OpenAI has maintained that ChatGPT is not responsible for the attack. A company spokesperson told multiple news outlets that after learning of the incident, OpenAI identified the account believed to be associated with Ikner and shared it with law enforcement. The company stated that ChatGPT did not encourage or promote illegal or harmful activity and that its responses were based on publicly available information.

“Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime,” the spokesperson said in a statement to CBS News. The company added that it continues to cooperate with authorities and is working to strengthen safety measures, detect harmful intent, and prevent misuse of its technology.

Florida Department of Law Enforcement Commissioner Mark Glass echoed the importance of the investigation, stating that understanding the risks of AI in criminal behavior is essential for public safety. “The more we can educate ourselves, the better we can protect ourselves, our loved ones, and our communities from scams, fraud, and much worse,” he said.

The investigation comes as state and federal officials grapple with the legal and ethical implications of generative AI in real-world harm. While no charges have been filed against OpenAI or its executives, the probe marks one of the first criminal inquiries into whether an AI company could bear legal responsibility for how its tools are used.

As of the announcement, Ikner has pleaded not guilty to all charges, with his trial scheduled to start in October 2026. The outcome of the investigation could influence future legal frameworks governing AI accountability and developer liability.
