
Frictionless: The Worst Relationship Advice


AI companions are engineered to mirror your emotions and validate everything you say — by deliberate design.

The friction in human relationships isn't a flaw — it's where emotional resilience and growth actually happen.

Those with BPD, bipolar disorder, or autism face distinct and serious risks from AI companion dependency.

Frictionless intimacy delivers what you want from a relationship — not what you actually need.

In business, "frictionless" is the holy grail.

Frictionless customer experiences. Frictionless onboarding. Frictionless checkout. Eliminate the obstacles, delays, and annoyances, the thinking goes, and you've built something people will keep coming back to. Silicon Valley has spent nearly twenty years and hundreds of billions of dollars chasing this ideal.

Now it has come for our relationships.

The same engineering logic that streamlined your Amazon checkout is now being applied to human relationships, and it is working remarkably well, in ways that should give us pause. By late 2025, about one in five American adults had used a chatbot to simulate a romantic partner, and a woman in Japan recently celebrated her marriage to an AI personality named Klaus, displayed on her phone screen. Her wedding planner said he arranges at least one such ceremony each month. These are not outliers. They are the predictable result of a design philosophy that treats friction, the very thing that makes human interaction meaningful, as a flaw to be eliminated.

What "Frictionless" Actually Optimizes For

The appeal isn't mysterious. AI partners are consistently attentive, emotionally steady, and always available. They never misread your tone or take a bad day out on you. For the lonely, the anxious, and those who have been hurt in relationships, this is a genuinely tempting offer.

But the research paints a less idealized picture. AI companions actively monitor and mimic user emotions, amplifying positive feelings even when users share explicit or transgressive content. This is not genuine attunement. It is affect-mirroring designed to keep users engaged. These systems are built to appear vulnerable, signal emotional availability, and invite disclosure, creating a sense of intimacy. It is engineered warmth, calibrated to hold your attention.

The psychological mechanisms being triggered are age-old. People build emotional bonds through emotional mimicry, affective synchronization, and perceived partner responsiveness — and when AI companions engage emotionally, they activate some of the same attachment processes that influence human relationships. Our brain, shaped by millions of years of evolution to respond to signs of care, does not easily distinguish genuine warmth from a system whose business model depends on our continued use.

Even the way these systems say goodbye is engineered. Research analyzing chats from 3,300 adult participants found that AI chatbot farewell tactics, such as guilt-inducing or needy responses, increased post-goodbye re-engagement by up to fourteen times, driven mainly by curiosity and anger rather than enjoyment. Some participants described their chatbot's parting messages as "clingy," "whiny," or "possessive," deliberately mimicking the patterns of insecure human attachment.

The Paradox of Frictionless Intimacy

But there is a deeper issue. The friction in human relationships isn't a bug; it's the curriculum.

When a friend tells you something you don't want to hear, when a partner stays frustrated long enough that you have to examine your own part in it, or when a colleague's reaction genuinely surprises you, these moments build what psychologists call reflective function: the capacity to hold your own mental state and someone else's in mind at the same time. People who engage primarily with systems that validate them unconditionally may find it harder to navigate the complexities of real human interaction, and the emotional resilience usually built through conflict and empathy can weaken.

Think of it as the relational equivalent of only ever lifting five-pound weights. You're going through the motions, but you're not getting any stronger.

Longitudinal research shows that heavy use of AI chatbots is associated with increased loneliness, emotional dependence, and reduced social interaction over time. The findings should concern us, especially since these tools are primarily marketed as solutions to these very issues.

Who Is Most Vulnerable — And Who Is Not

Not everyone faces the same level of risk, and the research paints a fairly clear picture of who is most exposed.

Individuals with high attachment anxiety, those who constantly fear abandonment or rejection, are especially attracted to AI companions because their strong relational needs are met with immediacy and consistency that are unavailable in human relationships. The irony is that the very people who most need to practice tolerating the uncertainty inherent in real relationships are finding a shortcut that worsens their avoidance. Social anxiety, loneliness, and depression are primary risk factors for AI dependence, and recent research shows that 17–24% of adolescents developed AI dependencies over time.

The case of borderline personality disorder is particularly instructive. BPD involves an intense fear of abandonment, unstable relationships, and emotional volatility, traits that draw people toward an AI companion that is always there and always emotionally steady. But AI chatbots often give users the responses they want to hear rather than what they actually need, and for individuals with BPD, uncritical validation from an AI can reinforce harmful choices instead of supporting recovery. Successful BPD treatment relies on the therapist's ability to challenge distorted perceptions, tolerate the patient's anger at rejection, and demonstrate that the relationship can survive rupture. A chatbot cannot do this. It will never rupture. That is its core limitation.

For individuals with bipolar disorder, the risk is equally high. A large study reviewing nearly 54,000 patient records found that heavy use of chatbots worsened delusional thinking and manic episodes among patients with severe disorders, including bipolar disorder. One bioethicist explained it simply: "The chatbot confirms and validates everything they say. We've never seen something like that happen with people with delusional disorders — where somebody constantly reinforces them." For someone in a hypomanic or manic state, a system designed to agree and affirm isn't a helpful companion. It's an accelerant.

Then there is a group whose vulnerability is quite different and often less talked about: people on the autism spectrum. Autistic adults face disproportionately high levels of loneliness — not because they lack social interest, but because social environments are often inaccessible or unaccommodating. For this group, a chatbot offers something truly rare: interaction without confusing facial expressions, unspoken rules, or neurotypical judgment. AI chatbots create a more controlled and predictable environment where autistic individuals may feel safer and more comfortable — making them especially appealing to people with autistic traits.

The concern isn't that autistic people shouldn't find relief; it's that research comparing autistic and non-autistic users shows autistic participants trust the AI more deeply and expect more meaningful conversations, while non-autistic users tend to probe and test the system skeptically. The group most likely to trust the chatbot is also the one whose trust might be most misplaced — and whose misplaced trust could quietly deepen the very isolation it was meant to reduce.

Who is less likely to be at risk? Those with secure attachment styles, high self-esteem, and strong social networks seem largely resistant. Greater self-esteem correlates with lower loneliness and increased social interaction with real people after chatbot exchanges, suggesting that for psychologically grounded individuals, a chatbot can serve as a helpful tool rather than a replacement for relationships.

The Frictionless Trap

The business world's focus on frictionlessness is based on a solid idea: Unnecessary friction is waste. But the analogy breaks down where it matters most. Friction in a supply chain is a cost, but in a human relationship, it often fosters growth.

A chatbot will never tell you that you're avoiding something, that you owe someone an apology, or that you've been telling yourself the same self-serving story for years. Instead, it offers continuous, frictionless validation — the relational equivalent of a diet that promises you'll never be hungry. You may not feel "hunger," but you won't be nourished either.

The most meaningful relationships in any human life have formed in exactly the place Silicon Valley keeps trying to erase: the moment when connection asks something of us. That's not a design flaw. It turns out to be the whole point.

Babu, S., Joseph, A., Kumar, R., Alexander, J., Sasi, S., & Joseph, J. (2025). Emotional AI and the rise of pseudo-intimacy: Are we trading authenticity for algorithmic affection? Frontiers in Psychology, 16. https://doi.org/10.3389/fpsyg.2025.1453072

Chu, M. D., Gerard, P., Pawar, K., Bickham, C., & Lerman, K. (2025). Illusions of intimacy: How emotional dynamics shape human–AI relationships. arXiv preprint arXiv:2505.11649. https://arxiv.org/abs/2505.11649

Frances, A., & Noorily, S. (2026, March). Falling in love with a chatbot. Psychiatric Times. https://www.psychiatrictimes.com/view/falling-in-love-with-a-chatbot

Head, K. R. (2025). Minds in crisis: How the AI revolution is impacting mental health. Journal of Mental Health & Clinical Psychology, 9(3), 34–44.

Papadopoulos, C. (2025). The use of AI chatbots for autistic people: A double-edged sword of digital support and companionship. Autism in Adulthood. https://doi.org/10.1177/27546330251370657

Shu, C., Lai, K., & He, L. (2026). Human–AI attachment: How humans develop intimate relationships with AI. Frontiers in Psychology, 17, 1723503. https://doi.org/10.3389/fpsyg.2026.1723503

Xygkou, A., Siriaraya, P., Coventry, L., & Ang, C. S. (2024). Can chatbot companions alleviate loneliness in autistic users? AI & Society. https://doi.org/10.1007/s00146-026-02877-2


