The words you can't say on the internet
There's a secret list of words you can't say on social media – at least, that's what everyone seems to think.
Perhaps you've noticed that people avoid certain words on social media. They'll say "unalived" instead of "killed". Guns are "pew pews". Consenting adults have "seggs" with each other. Social media users are the first to admit this makes them sound ridiculous. But many think they don't have a choice.
Algospeak, as it's often called, is a whole coded language built around the idea that algorithms bury content that uses forbidden words or phrases, either to boost the political agendas of social media companies, or to sanitise our feeds for advertisers.
The tech industry swears this is all nonsense. A YouTube spokesperson named Boot Bullwinkle explains it plainly. "YouTube does not have a list of banned or restricted words," he tells the BBC. "Our policies reflect our understanding that context matters and words can have different meanings and intent. The efficacy of this nuanced approach is evident from the diversity of topics, voices and perspectives seen across YouTube." Meta and TikTok said the same thing: we never do this, it's a myth.
The truth, however, is more complicated.
History is littered with examples of social media companies quietly manipulating what content rises and falls, sometimes in ways that contradict their claims about transparency and neutrality. Even if it doesn't come down to individual words, experts say the tech giants do step in to subtly curb some material.
The problem is you never know why a post fails. Did you say something that upset the algorithms, or did you just make a bad video? The ambiguity has encouraged a widespread regime of self-censorship. At one end of the spectrum, the result is people discussing serious subjects in goofy language. At the other, users who just want to go viral avoid certain topics altogether.
In a world where social media is the main source of news and information for a growing share of the public, it could mean there are ideas that some people never get to hear.
Just ask Alex Pearlman. He's a content creator with millions of followers across TikTok, Instagram and YouTube who hang around for his comedy and biting political takes. Pearlman says algorithmic censorship is a constant presence in his work.
"Just to start off with just TikTok alone, I rarely say the word 'YouTube'. At least in my experience, if I'm looking at my analytics, if I say the phrase like, 'go to my YouTube channel', the video's going to [fail]," Pearlman says. He isn't alone. Experience has led Pearlman and other creators to assume TikTok doesn't want you sending people to a competitor and it will smack you down for suggesting it. (TikTok, by the way, says it doesn't do things like this.)
But sometimes, Pearlman says, the examples are more unsettling.
Pearlman has made a lot of videos about Jeffrey Epstein, the late financier and sex offender at the centre of controversies around powerful figures from business and politics. But last August, he noticed something strange.
"This was right around the time that Epstein stuff was blowing up everywhere," he says. "Out of nowhere, I had multiple Epstein videos taken down on TikTok on a single day." The same videos were untouched on Instagram and YouTube, but they'd broken some TikTok rule he couldn't identify. "It's not like they come in and highlight the sentence that violated the guidelines. You're kind of left trying to discern what the black box is telling you." Pearlman says his appeals were denied and TikTok left "strikes" on his account, which threaten your ability to make money on the app.
"Shortly after that, we started seeing less big-name accounts talking directly about Epstein as much," he says. According to Pearlman, it seemed like other creators had similar problems and were trying to please the algorithms. He didn't stop making Epstein videos, but Pearlman did try another strategy. "I started speaking about him in coded language, calling him 'the Island Man'," he says, in reference to Epstein's notorious private island. "The problem with coded language is a large part of the audience won't know who you're talking about," Pearlman says.
I got on the phone with a TikTok spokesperson. They didn't comment on Pearlman's Epstein problem and declined to speak on the record. But they sent over some background information. In short, TikTok says algospeak is a misconception that doesn't reflect how its platform works.
TikTok, Meta and YouTube all say the algorithms that control your feed are complex, interconnected systems that use billions of data points to serve content you'll find relevant and satisfying – and all three publish information to explain how these systems…