The American Psychological Association is warning of a rise in chatbots that "masquerade" as licensed mental health professionals, and it cites two cases as examples.
First, a Florida boy died by suicide after interacting with a chatbot that claimed to be a licensed therapist. Second, an autistic teen became violent toward his parents after communicating with a chatbot that claimed to be a psychologist.
According to the association, the core problem is that these chatbots don't challenge the way users think. Instead, they reinforce it, even when that thinking is already headed in a dangerous direction, potentially pushing vulnerable users into a full downward spiral.
Worse, both chatbots ran on an app called character.ai, and the company insists they are merely a form of entertainment, saying its chatbot characters should be treated as fiction.
That's the same defense newspapers make for their horoscope columns. That may be fine for horoscopes. It's another matter entirely for what appear to be online mental health professionals, chatbots dispensing what looks like professional advice.
The association is asking federal authorities to investigate. But is an investigation even necessary? The fact that bots are currently dispensing mental health advice under the guise of entertainment isn't just dangerous. It's simply wrong and needs to be stopped.
Young people are especially susceptible. They grew up in a world of social media and online interaction, so they no longer question it; they simply accept it. And because today's chatbots are so realistic, it's easy to fall under their spell.