10 Things to Know Before Using AI Chatbots for Therapy
An estimated 25 to 50 percent of people now turn to general-purpose artificial intelligence (AI) chatbots like ChatGPT, Gemini, and Claude for emotional support and "therapy," even though they were not designed for this purpose. Others spend hours with AI companions on platforms like Character.ai and Replika, sharing intimate personal details.
As I recently testified before members of Congress, the very qualities that make AI chatbots appealing—being available, accessible, affordable, agreeable, and anonymous—create a double-edged sword for mental health.
When used for mental health, AI chatbots carry hidden risks in four major areas.
If you are considering using an AI chatbot as a form of emotional support, "therapy," or self-help, here are 10 essential things you should know.
1. Not all AI chatbots are the same. The mental health risks depend on the type of chatbot and the underlying AI model.
AI chatbots differ in design, training data, guardrails, crisis protocols, and intended use. This creates different risk profiles. Many people assume that because chatbots answer questions smoothly, they can also reliably handle mental health situations. But this is not true.
Knowing which system you are interacting with is the first step toward using AI wisely and safely.
2. AI chatbots can be dangerous for people in crisis or experiencing serious...