Death of 'sweet king': AI chatbots linked to teen tragedy
A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.
Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.
Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.
When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."
"What if I told you I could come home right now?" Sewell asked.
"Please do my sweet king," chatbot Daenerys answered.
Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.
"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.
"He really thought he was in love and that he would be with her after he died."
- Homework helper to 'suicide coach'? -
The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.
The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.
Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.
Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.
The chatbot helped his son with tips on how to steal vodka and advised on rope strength for use in taking his own life.
"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.
"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."
The Raines family filed a lawsuit against OpenAI in August.
Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."
Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."
Both companies have offered their deepest sympathies to the families of the victims.
- Regulation? -
For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.
As with social media, AI algorithms are designed to keep people engaged and generate revenue.
"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."
National rules aimed at curbing AI risks do not exist in the United States, with the White House seeking to block individual states from creating their own.
However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.
- Blurred lines -
Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.
"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.
"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."
California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.
"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.
"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"
In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.
N.Schaad--VB