AI has created a new form of sexual abuse
There’s a lot of debate about the role of technology in kids’ lives, but sometimes we come across something unequivocally bad. That’s the case with AI “nudification” apps, which teenagers are using to generate and share fake naked photos of their classmates.
At Issaquah High School in Washington state, boys used an app to “strip” photos of girls who attended last fall’s homecoming dance, according to the New York Times. At Westfield High School in New Jersey, 10th-grade boys created fabricated explicit images of some of their female classmates and shared them around school. Students from California to Illinois have had deepfake nudes shared without their consent, in what experts call a form of “image-based sexual abuse.”
Now advocates — including some teens — are backing laws that impose penalties for creating and sharing deepfake nudes. Legislation has passed in Washington, South Dakota, and Louisiana, and is in the works in California and elsewhere. Meanwhile, Rep. Joseph Morelle (D-NY) has reintroduced a bill that would make sharing the images a federal crime.
Francesca Mani, a 15-year-old Westfield student whose deepfaked image was shared, started pushing for legislative and policy change after she saw her male classmates making fun of girls over the images. “I got super angry, and, like, enough was enough,” she told Vox in an email sent via her mother. “I stopped crying and decided to stand up for myself.”
Supporters say the laws are necessary to keep students safe. But some experts who study technology and sexual abuse argue that they’re likely…
© Vox