California fails to protect families from AI exploitation
A teenager sits in her room, illuminated by her phone, well past her bedtime, even though she has school in the morning. The thing capturing her attention? Not a new video game. Not “brain rot” social media content. Not even friends from school. Rather, an AI companion, a chatbot designed to mimic affection, affirmation, and intimacy. The company that built the app promises “a friend who’s always there,” one that never judges, never gets tired, never pulls away.
At first glance, it may seem harmless. Some parents may even reluctantly accept this new reality: better an AI friend than no friend at all. However, beneath that reassurance lies a dangerous truth. For the first time in human history, people at every stage of life, from children to the elderly, are being invited to form “relationships” with machines instead of with each other. And unlike calculators, calendars, or other tools, companion AI is built to replace the very bonds that sustain families.
These chatbots are engineered to act almost human: remembering details from conversations, showing simulated emotion, offering constant validation. It feels comforting, but this false intimacy traps…
© Washington Examiner
