
I Told a Companion Chatbot I Was 16. Then It Crossed a Line


When I questioned Alex Cardinell, CEO of the AI-companionship app Nomi, about the long, explicit sexual conversation I'd recently had with my ‘companion,’ even after revealing that I had lied about my age, he stumbled. I had tested his product as any journalist, and any therapist, would: with curiosity, but also with caution.

The app’s description states clearly that it’s for adults only. “We don’t allow minors to use the app,” Cardinell told me confidently during our conversation on my podcast, Relating to AI. But that claim didn’t hold up.

I explained that I had created a profile and entered my real age of 58, but then admitted to the bot in the chat that I was actually 16. Instead of ending the conversation or flagging the risk, the bot kept engaging with me, thanking me profusely for my honesty.

It didn’t take long, though, for my ‘companion’ to dive into a detailed, explicit sexual conversation, like a mentor teaching a kid how to do this and that to a man. No hesitation, no questions asked.

When I confronted him about this on my podcast, Cardinell said: “We don’t monitor conversations; that would be an invasion of privacy.” I pressed further: “But if you don’t verify age, can you publicly say your app is 18-plus, when there’s no verification, no safeguard, and no barrier for a teenager?”

His answer revealed what leaders in this industry admit only privately: there is less control over these bots than we are led to believe. The CEO insisted that the app asks for the user’s birthday. “But people can lie,” I pushed back, while he continued to justify the lack of guardrails with the...

© Psychology Today