A billion chatbot users can’t be wrong … or can they? Let’s ask a human
Amelia Miller has an unusual business card. When I saw the title “Human-AI Relationship Coach” at a recent technology event, I presumed she was capitalising on the rise of chatbot romances to make those strange bonds stronger. It turned out the opposite was true. Artificial intelligence tools were subtly manipulating people and displacing their need to ask others for advice. That was having a detrimental impact on real relationships with humans.
Miller’s work started in early 2025 when, while interviewing people for a project with the Oxford Internet Institute, she spoke to a woman who had been in a relationship with ChatGPT for more than 18 months. The woman shared her screen on Zoom to show ChatGPT, which she’d given a male name, and, in what felt like a surreal moment, Miller asked both parties if they ever fought. They did, sort of. Chatbots are notoriously sycophantic and supportive, but the woman sometimes got frustrated with her digital partner’s memory constraints and generic statements.
Seeking talking points? Why not start with a fellow human. Credit: iStock
Why didn’t she just stop using ChatGPT? The woman answered that she had come too far and …




















