11:31 GMT - Tuesday, 25 February, 2025

Chatbots that cause deaths? Youth advocacy groups pushing for stricter regulation

Home - Family & Relationships - Chatbots that cause deaths? Youth advocacy groups pushing for stricter regulation

As artificial intelligence chatbots gain popularity among users seeking companionship online, youth advocacy groups are stepping up legal efforts to protect minors, fearing that children can form unhealthy, dangerous relationships with the humanlike creations.

Chatbot apps such as Replika and Character.AI belong to the fast-growing generative AI companion market, where users can customise their virtual partners with nuanced personalities that communicate and simulate close relationships.

Developers say AI companions can combat loneliness and improve users’ social experiences in a safe space.

But several advocacy groups in the United States have sued developers and are lobbying for stricter regulation, claiming chatbots have pushed children to hurt themselves and others.

Chatbots have pushed children to hurt themselves and others, say youth advocacy groups in the United States. Photo: Jelly Tse

Matthew Bergman, founder of the Social Media Victims Law Centre (SMVLC), is representing families in two lawsuits against chatbot start-up Character.AI.

One of SMVLC’s clients, Megan Garcia, says her 14-year-old son took his own life due in part to his unhealthy romantic relationship with a chatbot.
