
AI-Images, Real World Harms

20.01.2026

Grok, the generative AI tool integrated with Elon Musk's X, isn't technically a dedicated "undressing app." Yet over the past few weeks, lax and inconsistent safeguards made it abundantly clear that users could generate scantily clad or nearly nude images of real people with a simple prompt.

As a result, sexualized images, including images of children, have been widely shared on a mainstream platform. Rather than merely hosting harmful user-generated content, the platform is actively producing nonconsensual sexual imagery.

Grok's latest internet scandal has triggered significant pushback and legal scrutiny. UK media regulators have already opened a formal investigation, and other countries are threatening to suspend X. Some U.S. senators are urging Apple and Google to remove X from their app stores. In response, X initially limited these AI image requests to paid subscribers, but now says it has implemented measures to prevent Grok from undressing images of real people.

Young people deserve better than a world where a company's first instinct is to monetize the ability to create and share sexualized images of real children and non-consenting adults. Research on image-based sexual abuse already tells us that the nonconsensual distribution of intimate or sexual material can have significant psychological and emotional consequences. Images of real people suddenly clad in bikinis, underwear, or clear plastic undergarments may be AI-generated, but the anger, embarrassment, fear, and powerlessness that follow are very real.

Women and girls are disproportionately targeted. These images can become flash points for........

© Psychology Today