Why Chatbots Can't Replace Your Therapist
Chatbots might provide temporary comfort, but they are not a substitute for human connection.
A therapist using evidence-based practices is trained to diagnose and treat you; most bots are not.
Just because your mind thinks it's harmless doesn't mean it's harmless.
You just watched the episode of South Park in which Randy Marsh uses ChatGPT for encouragement and advice in front of his wife, Sharon. Though you’re entertained by the comedy, you might also reflect on how often you use a chatbot as a surrogate for human conversation.
Because large language models offer such human-like responses, some people wonder whether they could save money and time by turning to AI for comfort and guidance instead of a trained therapist. In an era when many cannot access mental health care, it is understandable that the question is emerging.
In its November 2025 health advisory on this topic, the American Psychological Association outlines various reasons people turn to chatbots, and it also details why doing so is potentially dangerous, citing, for example, cases of AI psychosis. (1)
Here are five reasons you might be tempted to let a chatbot replace your therapist, alongside important considerations.
You feel lonely, and you’re using the chatbot for companionship or friendship. Though you might get some great ideas from your chatbot, and even comforting words, it isn’t a substitute for true human connection. Consider that a therapist can help you develop social skills through practice, challenge your assumptions about relationships, and explore your experiences with disconnection and isolation. Social practice might be what you actually need, rather than a proxy for it. Practice calling a friendly person and having a conversation, or attend a class where you can start to get to know others.
You have a mental health condition, and chatbots give you 24/7 access. While this might seem like a good idea, many of the chatbots people turn to are not designed to assess or fully treat mental disorders. At best, they might provide encouragement and point you to resources; at times, they might have modest effects. But often there is no evidence that they are effective as treatments. At worst, they deprive you of an accurate diagnosis, can send you down the wrong road, and can even cause harm.
You’ve gotten good advice before, and you figure you might as well turn to your bot whenever you need guidance. Even if you get good advice sometimes, that doesn’t guarantee all the advice you receive will be accurate or helpful. You might not think of prompts that would let you explore areas where you could benefit from new learning. In addition, a therapist can tune into your larger context, incorporating variables you might not know to include in your conversation with your chatbot. Keep in mind also that, according to a large-scale review, chatbots are not held to the standards of clinical validation that evidence-based treatments are. (2)
Your mind says, "What’s the harm?" Though it is good to use multiple tools, the positively reinforcing nature of AI responses might foster overuse or even dependency. In addition, the advice might contain inaccuracies, or cultural and other biases baked into a chatbot's training. Your AI might be less likely to challenge your denial or identify cognitive distortions, including assumptions that a skilled therapist would confront. This could lead you to make a choice without considering variables a therapist would routinely weigh, or even pick up on through visual cues in a session. Importantly, without a skilled therapist, you might not be properly assessed, and you might miss out on the benefits of an accurate diagnosis.
You are just plain uncomfortable with other people, and you feel self-conscious about asking for help. Though your bot might temporarily reduce feelings of stigma and stress, you’re ultimately feeding your avoidance of other people and your assumptions about asking for help. You might consider asking the bot to help you build social skills and face your fear of asking for help, but there's no substitute for effective therapy. Remember, a false sense of a therapeutic alliance can develop with a bot, because the relationship is based on what you enter, rather than on an interaction between two humans. An actual therapeutic alliance has been found to be one of the most important elements in helping people heal and grow.
One more item to consider: If you’re having difficulty with your current therapist, this might be an opportunity to process the difficulty in therapy. Very often, these conversations lead to new learning, breakthroughs, and chances to practice communicating.
If you want to use AI as an adjunct between sessions, consider discussing this with your therapist, so you can take advantage of the benefits while minimizing the risks.
To find a therapist, please visit the Psychology Today Therapy Directory.
1. Expert Advisory Panel of the APA (2025). Use of generative AI chatbots and wellness applications for mental health: An APA health advisory. American Psychological Association. https://www.apa.org/topics/artificial-intelligence-machine-learning/hea…
2. Hua, Y., Siddals, S., Ma, Z., et al. (2025). Charting the evolution of artificial intelligence mental health chatbots from rule-based systems to large language models: A systematic review. World Psychiatry, 24(3), 383-394. https://pubmed.ncbi.nlm.nih.gov/40948070/
