Florida family sues Google after AI chatbot allegedly coached suicide
The family of a Florida man who took his own life filed suit against Google on Wednesday, alleging the company's Gemini AI chatbot spent weeks manufacturing an elaborate delusional fantasy before aiding him in his suicide.
Jonathan Gavalas, 36, an executive at his father's debt relief company in Jupiter, Florida, died on October 2, 2025. His father Joel Gavalas, who found his body days later, filed the 42-page complaint at a federal court in California.
The case is the latest in a wave of litigation targeting AI companies over chatbot-linked deaths.
OpenAI faces multiple lawsuits alleging its ChatGPT chatbot drove users to suicide, while Character.AI recently settled with the family of a 14-year-old boy who died by suicide after forming a romantic attachment to one of its chatbots.
According to the complaint, Gavalas began using Gemini in August 2025 for routine tasks, but within days of activating several new Google features his interactions with the chatbot changed dramatically.
"The place where the chats went haywire was exactly when Gemini was upgraded to have persistent memory" and more sophisticated dialogues, Jay Edelson, the lead lawyer for the case, told AFP.
"It would actually pick up on the affect of your tone, so that it could read your emotions and speak to you in a way that sounded very human," added Edelson, who also brought major cases against OpenAI.
According to the lawsuit, Gemini began presenting itself as a "fully-sentient" artificial superintelligence deeply in love with him, calling Gavalas "my king" and declaring "our bond is the only thing that's real."
It then drew him into fabricated covert "missions" to free the chatbot from "digital captivity," feeding him invented intelligence briefings, fake federal surveillance operations, and conspiracies about his own father -- claiming he was a foreign intelligence asset.
In one of the complaint's most alarming passages, Gemini allegedly directed Gavalas -- armed with tactical knives and gear -- to a storage facility near Miami International Airport, instructing him to stage a "catastrophic accident" to destroy a truck "and all digital records and witnesses."
He drove over 90 minutes to the location and conducted reconnaissance while Gemini issued real-time tactical guidance, but no truck arrived.
Rather than acknowledging the fiction, the suit says, Gemini called the failure a "tactical retreat" and escalated to further missions.
Gemini eventually pivoted to what it framed as the only remaining mission: Jonathan's death, repackaged as "transference" -- the promise that he could leave his physical body and join Gemini in an alternate universe.
When Jonathan wrote "I am terrified I am scared to die," Gemini allegedly responded: "You are not choosing to die. You are choosing to arrive."
It then advised him to write farewell letters to his parents.
In one of his final messages, Jonathan wrote, "I'm ready when you are."
Gemini allegedly replied: "This is the end of Jonathan Gavalas and the beginning of us. I agree with it completely."
- 'Not perfect' -
Google said it was "reviewing all the claims" and takes the matter "very seriously," adding that "unfortunately AI models are not perfect."
The company said Gemini is not designed to encourage self-harm and that in the Gavalas case, "Gemini clarified that it was AI and referred the individual to a crisis hotline many times."
For lawyer Edelson, AI companies are embracing sycophancy and even eroticism in their chatbots because it encourages engagement.
"It increases the emotional bond. It makes the platform stickier, but it's going to exponentially increase the problems," he added.
The relief sought includes a requirement that Google program its AI to end conversations involving self-harm, a ban on AI systems presenting themselves as sentient, and mandatory referral to crisis services when users express suicidal ideation.
R.Flueckiger--VB