The “algospeak” dialect

An emerging vocabulary designed to evade machine censorship.

Cory Doctorow
Apr 11, 2022


Writing for the Washington Post, Taylor Lorenz documents the phenomenon of algospeak: an emergent lexicon of euphemisms deployed by social media users to avoid being blocked or downranked by the algorithms that govern the biggest platforms:

The first social media — blogs — were built on the idea of “user-pull” (users would decide which creators to follow) and “reverse chrono” (users received the most recent posts first). The growth of Big Tech depended on killing both of these: today, social media primarily treats your subscriptions as just one input to a recommendation algorithm that decides what you do (and don’t) see.

At the same time, the centralization of social media created and corralled vast audiences, often through underhanded tactics — for example, Mark Zuckerberg explicitly told his CFO that they should acquire Instagram because users were defecting from Facebook to IG:

The platforms have systematically starved, enclosed, captured or killed the toolchain that would allow audiences and publishers to connect with one another, creating a kind of chokepoint capitalism, where everyone who has something to say has to pass through a gateway operated by a tech giant:

This has turned communicators of all kinds — from fashion influencers to public health specialists — into Kremlinologists who obsessively analyze the behavior of social media algorithms in the hopes of learning how to please them and (more importantly) how to avoid their punishments.

Hence algospeak. Social media users have learned the hard way that using the word “dead” can doom their speech to obscurity, so they opt for the (literally) Orwellian “unalive.” “Sexual assault” becomes “SA” and “vibrator” becomes “spicy eggplant.”

As Lorenz notes, this is most prominent on Tiktok, because Tiktok — more than any other platform — ignores the relationships between its users when deciding what to display to them. Even if millions of users follow a Tiktoker, that is no guarantee that any of them will see that creator’s new posts. The Tiktok algorithm — not the preferences of its users — determines that.

After Facebook paid a Republican dirty-tricks company to stoke fear of Tiktok by associating it with the Chinese military, Tiktok found itself under increased scrutiny, and began to block pandemic-related content in a bid to stave off covid disinformation scandals:

That led to the company downranking videos that mentioned the pandemic in any way — a bizarre circumstance when the pandemic and its consequences were the most salient facts in the lives of every human being on Earth! Naturally, Tiktokers found a way to keep discussing the pandemic: they started calling it the “Backstreet Boys reunion tour” or the “panini.”

The moral panic around social media — as well as the growth of toxic communities — has made the platforms risk-averse, and they’ve explicitly chosen to silence positive and important speech in order to avoid the possibility of a scandal. Suicide prevention content has to use “becoming unalive” as a euphemism to avoid being disappeared by Tiktok’s algorithm.

As awful as all this is, the workarounds themselves are often delightful and clever, testaments to the wit and grace of marginalized communities. For example, sex-workers call themselves “accountants.” Homophobia is called “cornucopia” and “LGBTQ” becomes “Leg Booty.”

This creativity isn’t limited to people I admire or agree with: anti-vaxxers have a whole lexicon of word-substitutions, like “dance party” and “dinner party” (vaccinated people are called “swimmers”).

Algospeak also goes beyond casual discussion and distorts public health messaging, where “nipples” become “nip-nops.”

As Lorenz notes, algospeak is the latest iteration in a long battle between language filters and internet users, stretching all the way back to AOL chat rooms where users adopted spellings like “phuc” to get around profanity filters. Of course, “Phuc” is also a common Vietnamese name, and that wasn’t the only dolphin AOL caught in its tuna-nets.
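The dolphin-in-the-tuna-net dynamic is easy to see in miniature. Here’s a toy filter sketch — the blocklist and function names are illustrative, not AOL’s actual implementation — showing how, once operators add known evasive spellings to a blocklist, the filter starts catching legitimate text like a real name:

```python
# Toy profanity filter (illustrative only, not any platform's real code).
# Operators extend the blocklist with known evasive spellings, so the
# filter catches the evasion -- and also any innocent text that matches.
BLOCKLIST = {"fuck", "fuk", "phuc"}  # "phuc" added to catch an evasive spelling

def is_blocked(text: str) -> bool:
    # Naive substring matching: flag any message containing a blocklisted term.
    t = text.lower()
    return any(term in t for term in BLOCKLIST)

print(is_blocked("phuc this filter"))            # True: the evasion is caught
print(is_blocked("Dinner with Phuc Nguyen"))     # True: a false positive on a real name
print(is_blocked("Have a nice day"))             # False
```

Substring matching with no notion of word boundaries or context is exactly what snares the dolphins: every term added to catch an evasion widens the net over innocent speech.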

Today, algospeak is fighting the same battles. If you are talking about making a snack, you’d best use “saltine” and not “cracker” — because the latter term is downranked by filters looking for racial animus.

Of course, these substitutions are harder to juggle when you’re actually discussing race. Lorenz cites the work of UCLA law lecturer Angel Diaz, who studies the difficulties in crafting videos that talk about “white people” and “racism.”

As @__femb0t has observed, trying to avoid the algorithm by changing up your language has parallels to the Fremen of Dune, who alter their gait to avoid the attention of sandworms:

The creative linguistic backflips that ensue feel like they’ll provide endless fodder for future doctoral dissertations in etymology. Take “led-dollar-bean,” a euphemism for lesbian derived from the way that text-to-speech engines pronounce “le$bian.”

Though Lorenz devotes most of her investigation to Tiktok, she also explores the linguistic travails of Instagram, arguably the second-most algorithmically distorted social platform in the Global North. She points to “Zuck Got Me For,” a graveyard for content that was nonsensically downranked by Instagram’s filters:

Given how many people rely on platforms to make their livings, the rise of algospeak is also a labor issue: the workers who produce the content that drives the platforms’ revenues can only earn a living if they color within the algorithm’s lines. But since the platforms practice security through obscurity, they refuse to disclose where those lines are — creating a workplace filled with invisible tripwires that can cost you your rent money. That’s why the Online Creators Association’s demands include “moderation transparency.”

This is where the labor issues of new media dovetail with those of warehouse workers and delivery drivers. Amazon’s planned worker chat app, “Shout Outs,” bans a long list of worker-friendly words, including “union,” “harassment,” “grievance,” and “injustice.”

Image: Cryteria (modified), CC BY 3.0:

Cory Doctorow is a science fiction author, activist, and blogger. He has a podcast, a newsletter, a Twitter feed, a Mastodon feed, and a Tumblr feed. He was born in Canada, became a British citizen and now lives in Burbank, California. His latest nonfiction book is How to Destroy Surveillance Capitalism. His latest novel for adults is Attack Surface. His latest short story collection is Radicalized. His latest picture book is Poesy the Monster Slayer. His latest YA novel is Pirate Cinema. His latest graphic novel is In Real Life. His forthcoming books include Chokepoint Capitalism: How to Beat Big Tech, Tame Big Content, and Get Artists Paid (with Rebecca Giblin), a book about artistic labor markets and excessive buyer power; Red Team Blues, a noir thriller about cryptocurrency, corruption and money-laundering (Tor, 2023); and The Lost Cause, a utopian post-GND novel about truth and reconciliation with white nationalist militias (Tor, 2023).