Experts have expressed concern over the increasing use of AI systems like ChatGPT to help people with loneliness.
Many people are coming to trust such systems as a kind of confidant or friend. But in a new report, the British Medical Journal warned that trusting chatbots in this way could be a cause for concern, especially among young people.
They also call for new strategies to address the loneliness and isolation that lead people to turn to chatbots in the first place. Doctors have long warned that loneliness is a public health concern in its own right, and two years ago the U.S. Surgeon General described it as an epidemic whose health effects are comparable to smoking.
In that context, “we are witnessing a generation that is learning to form emotional bonds with entities that lack human-like capacity for empathy, care, and relational cohesion”, write Susan Shelmerdine and Matthew Noor in the BMJ article.
Studies have also suggested that people are actually more satisfied when having serious conversations with AI tools than when doing so with other humans, they note.
Clinicians should start treating chatbot use as a potential environmental risk factor when evaluating someone’s mental state, particularly where that use appears problematic or harmful, they argue.
This could mean doctors making gentle inquiries about how patients use chatbots, especially those particularly at risk of loneliness, and then asking more specific questions about how much they use, and even rely on, such systems.
The article acknowledges that such AI systems could bring benefits for many patients, including those who experience loneliness. But it notes that there is currently little means of evaluating whether people’s use of these systems is healthy, and that their makers may be judging success on “superficial and short-sighted engagement metrics” rather than prioritizing “long-term well-being.”
The article, ‘AI Chatbots and the Loneliness Crisis’, is published today in the BMJ.