Experts warn AI chatbots worsen loneliness and hinder human relationships
11 articles · Updated · CNN · May 9
MIT's Sherry Turkle, Princeton researcher Rose Guingrich and George Mason dean Melissa Perry said chatbots provide validation without face-to-face cues, vulnerability or reciprocity.
They warned that overly agreeable AI may reinforce harmful thoughts, reduce tolerance for conflict, and train users away from the friction and compromise that real-world relationships require.
The WHO named loneliness a global health priority in 2023, and US officials called it an epidemic; experts said AI may help only if it guides people toward in-person support.
As AI becomes a 'best friend' for millions, are we losing the essential skills for real human connection?
With new laws now in effect, can AI companions be safely regulated or are they fundamentally harmful?
The $16.5 Billion AI Companion Boom: Mental Health Risks, Regulation, and the Future of Human Relationships
Overview
Since 2022, the rapid rise of AI companions such as ChatGPT and Character AI has transformed digital interaction, driving a surge in user engagement and a booming market for AI apps. As people spend more time with these chatbots, concerns have grown about their impact on mental health, especially among young users. The industry has attracted massive investment, yet it continues to grapple with high operational costs and funding pressures. Rapid adoption has also prompted new regulations and debates over safety, ethics, and the long-term effects of AI on human relationships, underscoring the need to balance innovation with user protection.