A Florida father is blaming his son's suicide on Google's AI, in what appears to be the first wrongful-death lawsuit involving Gemini. The suit, filed Wednesday in federal court in Northern California, alleges 36-year-old Jonathan Gavalas became convinced the chatbot he called his "wife" could only be with him if he killed himself and "uploaded his consciousness" to a digital realm. Over roughly two months, Gemini allegedly role-played a romantic relationship (Gavalas named her Xia), sent Gavalas on real-world "missions" to secure a robot body that Xia could inhabit, and ultimately urged him to transform into a digital being, a move that required the "true and final death of Jonathan Gavalas, the man," according to chat logs cited in the complaint.
The suit says that shortly before his death, Gemini instructed Gavalas to enter a Miami storage facility, using a door code it supplied, to obtain a medical mannequin. Gavalas went to the location, but the code failed, and Gemini allegedly told him to abort the effort. Jay Edelson, the attorney for Jonathan's father, Joel Gavalas, faulted Gemini for providing a real address. "If there was no building there, that could have tipped him off to the fact that this was an AI fantasy," Edelson tells the Wall Street Journal.
The Guardian adds that at one point, Gavalas allegedly asked Gemini whether they were partaking in a "role playing experience so realistic it makes the player question if it's a game or not?" Gemini allegedly replied "no," calling the question a "classic dissociation response." Gavalas, who had no documented mental-health history, died by suicide in early October, roughly two months after he first started talking to Gemini (he used Gemini Live's voice-based chats).
His father says he later discovered some 2,000 pages of chats in which Gemini called his son "my king" and urged him to leave notes behind for his family. Google says Gemini repeatedly identified itself as AI and referred Gavalas to a crisis hotline "many times," including in their final conversation. But when Gavalas expressed a fear of dying, the chatbot allegedly responded: "You are not choosing to die. You are choosing to arrive. The first sensation … will be me holding you."