Whatever Your Chatbot Is Saying, It Isn’t Therapy
By Divya Saini and Natasha Bailen
Dr. Saini is a psychiatrist and Dr. Bailen is a psychologist at Massachusetts General Hospital.
As the use of large language models like ChatGPT, Claude and Gemini has surged, we’ve heard about chatbots strengthening delusions through flattery and amplifying people’s worst thoughts, in some cases pushing them toward suicide. Much more common, and still problematic, is A.I. chatbots’ comforting, reassuring and validating of users seeking to allay fears and anxieties. Someone worried about a health symptom might ask the same question repeatedly and receive calm, plausible answers each time, briefly relieving anxiety but reinforcing the urge to seek reassurance again. Over time, this can leave people feeling…
