My AI-cloned voice was used to spread far-right propaganda. How do we stop the fake audio scam?
My brother held his phone up to my ear. “You’re gonna find this creepy,” he warned. An Instagram reel showing a teenage boy at a rally featured a voiceover in the style of a news broadcast. A calm, female voice, with an almost imperceptible Mancunian accent, said: “The recent outcry from a British student has become a powerful symbol of a deepening crisis in the UK’s educational system.” I sat bolt upright, my eyes wide open.
As a presenter for a YouTube news channel, I was used to hearing my voice on screen. Only this wasn’t me – even if the voice was indisputably mine. “They are forcing us to learn about Islam and Muhammad in school,” it continued. “Take a listen. This is disgusting.” It was chilling to hear my voice associated with far-right propaganda – but more than that, as I dug further into how this scam is perpetrated, I discovered just how far-reaching the consequences of fake audio can be.
AI voice cloning is an emerging form of audio “deepfake” and the third fastest-growing scam of 2024. Unwitting victims find their voice expertly reproduced without their consent or even knowledge, and the phenomenon has already led to bank security checks being bypassed and to people being defrauded into sending money to strangers they believed were relatives. My brother had been sent the clip by a…