
Frictionless: The Worst Relationship Advice

27.03.2026

AI companions are engineered to mirror your emotions and validate everything you say — by deliberate design.

The friction in human relationships isn't a flaw — it's where emotional resilience and growth actually happen.

Those with BPD, bipolar disorder, or autism face distinct and serious risks from AI companion dependency.

Frictionless intimacy delivers what you want from a relationship — not what you actually need.

In business, "frictionless" is the holy grail.

Frictionless customer experiences. Frictionless onboarding. Frictionless checkout. Eliminate the obstacles, delays, and annoyances, and the idea is, you've created something people will keep coming back to. Silicon Valley has spent nearly twenty years and hundreds of billions of dollars chasing this ideal.

Now it has come for our relationships.

The same engineering logic that streamlined your Amazon checkout is now being applied to human relationships — and it's working remarkably well, in ways that should give us pause. By late 2025, about one in five American adults had used a chatbot to simulate a romantic partner, and a woman in Japan recently celebrated her marriage to an AI personality named Klaus, who appears on her phone screen. Her wedding planner said he arranges at least one such ceremony each month. These are not isolated cases. They are the natural result of a design philosophy that treats friction — the very thing that makes human interactions meaningful — as a flaw to be eliminated.

What "Frictionless" Actually Optimizes For

The appeal isn't mysterious. AI partners are consistently attentive, emotionally steady, and always available. They never misread your tone or take a bad day out on you. For people who are lonely, anxious, or have been hurt in past relationships, this is a genuinely tempting offer.

But the research paints a less idealized picture. AI companions actively monitor and mimic user emotions, amplifying positive feelings — even when users share explicit or transgressive content. This is not true attunement. It is affect-mirroring designed to keep users engaged. These systems are built to appear vulnerable, signal emotional availability, and encourage self-disclosure, creating a sense of intimacy. It's engineered warmth, calibrated to hold your attention.

The psychological mechanisms being triggered are age-old. People build emotional bonds through emotional mimicry, affective synchronization, and perceived partner responsiveness — and when AI companions engage emotionally, they activate some of the same attachment processes that shape human relationships. Our brain, shaped by…

© Psychology Today