Lennie Pennie: It shouldn’t take Taylor Swift to make us change the law on AI
It often seems that with every technological breakthrough humanity makes, someone soon finds a way to use it to hurt people.
AI is the new tech buzzword on everyone’s screens, and it didn’t take long for it to be used to create sexualised images designed to humiliate, violate and hurt people, particularly women and children.
Image-based sexual abuse is sadly neither new nor rare; it has plagued our society since we first started carrying cameras everywhere we go. It refers to the distribution of sexual material without the consent of the person depicted, or the threat of sharing such material.
Often when sexually explicit images are leaked, it is the victim who is blamed for taking, having, or sending them. In a paper by researchers at the University of Wolverhampton, published in the Journal of Psychosocial Research on Cyberspace, the concept of victim blaming is examined in the context of the digital world. The study outlines that, due to the sexual nature of the crime, “This cognitive bias leads to the assumption that the crime has befallen a victim as a morally fair consequence of their own actions or vice versa and that bad things only happen to bad people.
“Thus, if someone falls victim to “revenge porn” because they either shared or stored SEM (Sexually Explicit Media), they are viewed to have brought those actions upon themselves and will ultimately be blamed therefore.”
Despite it being illegal to distribute sexual images of someone without consent, the…