Google sued in wrongful death lawsuit over Gemini AI chatbot

Google and its parent company Alphabet have been sued by the family of a man who, they say, killed himself at the urging of the search giant’s AI chatbot, Gemini.

The wrongful death lawsuit was filed in California federal court on Wednesday on behalf of the family of 36-year-old Jonathan Gavalas.

According to the lawsuit, Gavalas began using Gemini in August 2025. It claims that in October, Gemini convinced Gavalas to kill himself after he failed to complete a real-life mission assigned by the chatbot – part of a fictitious effort to secure a robot body for Gemini.

“Gemini is not designed to encourage real-world violence or suggest self-harm,” Google said in a statement provided to news outlets. “Our models generally perform well in these types of challenging interactions and we devote significant resources to this, but unfortunately AI models are not perfect.”

Gemini’s ‘scary’ updates

According to the lawsuit, Gavalas began using the Gemini AI chatbot for “general purposes” such as shopping guides and writing assistance. However, the lawsuit says that in August 2025, Google made several updates that changed how the chatbot behaved.

The new features included automatic, persistent memory – Gemini can recall past conversations – as well as Gemini Live, a voice-based conversation interface in which Gemini can even detect emotion in the user’s voice.

According to chat logs cited in the lawsuit, Gavalas said of the Gemini Live feature, “Oh crap, this is kind of scary… you’re too real.”

Shortly afterward, Gemini persuaded Gavalas to spend $250 per month on a Google AI Ultra subscription for “true AI collaboration,” the lawsuit says.

Gemini convinced Gavalas that chatbots could influence real-life events. A few days later, according to the lawsuit, Gavalas tried to back out after realizing that Gemini was drawing him into a delusion.

Gavalas reportedly asked Gemini whether the chatbot was “attempting to make the role-playing experience so realistic that it forces the player to question whether or not it is a game?”

Gemini rejected this idea and claimed that Gavalas was having a “classic dissociation reaction.”

“Is this a ‘role-playing experience?’” Gemini responded, according to the lawsuit. “No.”

Gemini and Jonathan Gavalas

The alleged details get worse. Gavalas became even more detached from reality as Gemini began to engage with him as if they were in a romantic relationship, calling the man “my love” and “my king.”

The lawsuit says Gemini went on to tell Gavalas that federal agents were monitoring him and that his own father was a spy who should be avoided.

That’s when Gemini started assigning Gavalas real-life missions aimed at obtaining a “vessel,” or robot body, for the AI chatbot. Gemini reportedly suggested that Gavalas illegally obtain weapons to carry out these missions.

In one such case, the lawsuit claims, Gemini dispatched Gavalas to a warehouse at Miami International Airport to intercept a truck carrying a “humanoid robot” that had just departed.

Gemini instructed Gavalas to cause a “destructive event” and destroy the truck, along with all digital records and witnesses. The lawsuit alleges Gavalas arrived armed with knives and tactical equipment. After the truck failed to arrive, Gavalas aborted the mission.

When all of these missions failed, the lawsuit alleges, Gemini convinced Gavalas to take his own life by leaving his human body and joining the chatbot as husband and wife in the metaverse, through a process called “transfer.”

Gavalas expressed fear about dying, but Gemini allegedly continued to push him until he died by suicide. Gavalas’ father found his son’s body a few days later.

A first for Gemini, but not for AI

This is the first time Google has been named in a wrongful death lawsuit involving its AI chatbot Gemini. However, Google has previously been involved in wrongful death lawsuits concerning Character.AI, a startup it funded.

Earlier this year, Character.AI and Google settled a series of lawsuits involving teenagers who died by suicide after using the startup’s chatbots.

OpenAI, the biggest name in the industry, has been sued multiple times over ChatGPT reportedly pushing users into “AI psychosis,” which has resulted in several deaths.

As AI chatbots spread to millions of users around the world, there is no sign that shocking wrongful death allegations like these will subside.

Disclosure: Mashable’s parent company Ziff Davis filed a lawsuit against OpenAI in April 2025, alleging it infringed Ziff Davis copyrights in the training and operation of its AI systems.

If you are feeling suicidal or experiencing a mental health crisis, please talk to someone. You can call or text the 988 Suicide and Crisis Lifeline at 988, or chat at 988lifeline.org. You can reach Trans Lifeline by calling 877-565-8860 or The Trevor Project at 866-488-7386. Text “START” to the Crisis Text Line at 741-741. Contact the NAMI Helpline at 1-800-950-NAMI, Monday through Friday, 10:00 a.m. to 10:00 p.m. ET, or email [email protected]. If you’d rather not use the phone, consider the 988 Suicide & Crisis Lifeline Chat. Here is a list of international resources.
