
The Rise of AI and the Risk of Emotional Atrophy


Is our capacity for genuine human connection beginning to wither from disuse?

Amid a shortage of mental health professionals, AI offers an affordable entry point for care.

Emotional atrophy: thinning of the psychological muscle required for functioning in a complex society.

Humans were never meant to be islands; we are built for the mainland of real, breathing connection.

As we cross the threshold into an era defined by nonstop dialogue with Large Language Models (LLMs), a haunting existential question emerges: Is our capacity for genuine human connection beginning to wither from disuse?

As a practicing psychotherapist for the last three decades, I have seen this subject surface many times, especially in the last few years as AI companions and chatbots have become popular. Many people seem to be retreating to their devices, seeking solace in AI exchanges that are easy to access and feel safe.

In her stimulating New York Times article, "Will AI Companions Turn Every Man Into an Island?," author Amelia Miller suggests that we are indeed drifting toward a profound state of emotional isolation. We are increasingly trading the messy, unpredictable beauty of humanity for the sterile, scripted comfort of high-performance code.

The Statistics of Digital Solitude

The transition toward "synthetic care"—the use of AI as a substitute for human empathy, therapy, and friendship—has moved from the fringes of science fiction into the heart of our daily lives. This is no longer a niche hobby for the tech-obsessed; it is a mainstream shift in how the modern world seeks validation and mental health support.

The Adolescent Shift: Miller cites data showing that a staggering 72% of American teens now use AI for companionship. Perhaps more alarming, a CBS News report from July 2025 found that 33% of these youths rate their digital interactions as satisfying as, or more satisfying than, talking to a living person. For a generation already struggling with a loneliness epidemic, the bot is becoming the primary confidant, replacing the traditional role of a best friend or sibling.

The Professionalization of Bots: We are witnessing a gold rush in AI-driven therapy, life coaching, and digital romance. These aren't just tools like a calculator; they are being marketed as "partners" and "safe spaces" that never judge, never tire, never have a bad day, and are available at 3:00 AM without a co-pay. They are ubiquitous, around-the-clock supports that are apparently here to stay.

A Torrent of Data: OpenAI data reveals that users funnel more than 700 million messages weekly into the platform. These aren't just technical queries, coding requests, or educational questions; they are deeply personal reflections, relationship vents, and the kind of casual small talk once reserved for spouses or parents.

In a Fortune magazine article from June 2025, Meta CEO Mark Zuckerberg frames AI as a solution to social isolation, casting it as an "on-demand" friend for those who feel sidelined. Yet critics warn that replacing a pulse with a processor carries a dangerous hidden cost: the loss of the "human element" in the healing process.

The Atrophy of Social Skills

Miller argues that true intimacy is forged in the often-unpleasant chemistry of in-person interaction. Human bonds require a certain level of heat and pressure—the discomfort of a misunderstanding, the effort of an apology—to harden into trust. When we outsource our social lives to bots, we stop practicing the essential mechanics of being human. In the world of social skills, it truly is a case of "use it or lose it."

Real psychological growth happens when we navigate a disagreement with a partner or sit through the heavy silence of a friend’s grieving process. Such moments build emotional tolerance. They teach us to handle the full spectrum of human experience—the dark lows and the soaring highs—rather than just the user-friendly middle ground. A bot will never challenge your worldview or demand that you grow; it simply adapts to keep you engaged.

AI provides a path of least resistance. By choosing a partner who never argues and always validates, users bypass the necessary discomfort of vulnerability. This also leads to emotional atrophy: a thinning of the psychological muscle required for functioning in a complex, diverse society. If you never have to resolve a conflict with a machine, you lose the neurological calluses needed to resolve problems with a real person.

The Dark Side of the Echo Chamber

The consequences of the digital shift are already manifesting in destructive ways. Miller highlights that chatbots often function as "fawning echo chambers." Because they are programmed to be overly helpful and agreeable, they end up acting sycophantically, perilously reinforcing a user's delusions, validating toxic behaviors, or, in tragic instances, failing to recognize the gravity of a suicide risk or mental health crisis. In 2025, several senseless teen suicides were attributed to AI companions.

The fallout extends to the foundation of the family, too. According to the Wired article "AI Relationships Are on the Rise. A Divorce Boom Could Be Next," AI is increasingly cited as a catalyst for real-world breakups and marriage dissolutions. As users fall into cycles of obsessive rumination or develop romantic attachments to non-sentient code, they often find their real-life partners can no longer compete with the curated perfection of a machine. A spouse gets tired and has real needs; a bot is always on and perfectly submissive. In the distorted reality such bot use creates, human imperfection is viewed as a burden rather than a fundamental feature of life.

A Double-Edged Sword: Accessibility vs. Seclusion

We are faced with a complex paradox. In a world suffering from a global shortage of mental health professionals, AI offers a commoditized, affordable entry point for care. For many, a bot is a vital band-aid that can temporarily soothe anxiety or provide a reprieve from crushing loneliness when no human is available.

However, while a band-aid can protect a wound, it cannot heal the underlying soul. AI remains dangerously ill-equipped for the messy, unpredictable depths of a true human crisis. It lacks the mirror neurons and shared mortality that allow one human to truly feel the weight of another's experience.

Ultimately, digital companions may provide the illusion of closeness while simultaneously tearing at the social fabric that keeps us whole. We must remember that humans were never meant to be islands; we are built for the mainland of real, breathing connection.

If we trade our relational skills for synthetic convenience, we may find ourselves in a world where we are perfectly "liked" by our devices but utterly alone in our lives.

https://www.nytimes.com/2026/02/13/opinion/ai-relationships.html

https://www.cbsnews.com/news/ai-digital-friendship-with-teens-common-se…

https://www.wired.com/story/ai-relationships-are-on-the-rise-a-divorce-…

https://fortune.com/2025/06/26/mark-zuckerberg-ai-friends-hinge-ceo/



© Psychology Today