
Why Are Clients Turning to AI for Mental Health?


Many individuals using AI for emotional support already have experience with traditional human therapy.

People seek out AI chatbots to access on-demand, pragmatic strategies for emotional regulation and support.

Clinicians who dismiss AI risk alienating clients and creating distance within the therapeutic relationship.

In my last AI for Interactive Journaling workshop, I ran a completely unscientific poll that really surprised me. Of the 55 people who reported therapy experience, 49 percent are turning to AI for mental health support. Not only that, but they are finding their work with AI helpful.

These are people who know what therapy looks and feels like.

And they’re still using and benefiting from AI.

It’s been easy to assume that most people using AI for emotional support were people who didn’t have access to therapy, couldn’t afford it, or didn’t really understand the process. However, Rousmaniere et al. (2025) reported that 87% of respondents who used AI for mental health support also had experience with human therapy. When Rousmaniere asked respondents to compare their experiences with AI versus human therapy, nearly 75% said their experience with AI was equal to or better.

I’m not arguing AI is better than therapy. I don’t believe that.

But if people who have experience with a human therapist are turning to AI either in place of or as a supplement to therapy, we need to take a closer look at what might explain this.

Some therapists believe that people should “know better” than to choose a computer over human therapy. Yet people who understand therapy seem quite comfortable chatting with an algorithm. Why?

The Need for Immediate, Actionable Troubleshooting

What if this data is telling us that some people don't need traditional therapy as their primary source of support? What if this suggests that what people actually need is faster, more direct, more immediately actionable help that AI can easily provide?

AI excels at breaking a problem down quickly, getting to the heart of the matter, and offering direct responses. This is a therapeutic approach many master's programs have not traditionally emphasized, favoring open-ended exploration instead.

Therapists may argue this isn't really therapy, that it is just “therapy-lite.” Someone using AI for emotional support is most likely not getting the same intensive benefits that therapy provides.

But this misses the point. AI isn't "therapy-lite"; it's a different tool for a different job. If chatting with AI provides someone relief from an anxiety spiral at 3 a.m., or helps them map out exactly why they feel stuck at work and what to do about it at 2 p.m. on a Tuesday, then maybe they don’t need the full, intensive benefits of traditional therapy in that moment.

The Risk of Creating Distance

The uncomfortable truth is that therapists who haven't tried AI themselves will struggle to talk about it meaningfully with clients who are already using it.

If AI becomes a taboo subject that cannot be brought up safely in session, clients will simply hide it. By judging the technology rather than getting curious about it, we will have created exactly the kind of distance and shame that therapy is supposed to dissolve.

Aren't we supposed to meet our clients where they're at? Today, a lot of them are finding relief through a screen.

This isn't just about AI and technology. It's about making therapy useful for the people we most want to be helping.

Rousmaniere, T., Zhang, Y., Li, X., & Shah, S. (2025). Large language models as mental health resources: Patterns of use in the United States. Practice Innovations. https://doi.org/10.1037/pri0000292
