Anti-cheat, gamers, and the Crowdstrike disaster

The constituency for trusted computing is fracturing.

Cory Doctorow
10 min read · Sep 16, 2024

On September 24th, I’ll be speaking in person at the Boston Public Library!

Politics requires coalitions, and coalitions are brittle. Good politics is often a matter of fracturing an opposing coalition — and, better yet, breaking off parts of a rival coalition and adding it to your own.

Think of the US “realignment” after the passage of the Civil Rights Act in 1964, which saw the “Dixiecrats” (southern Democrats) defect en masse to the Republicans — or think of all the reliably Democratic “rustbelt” precincts that swung for Trump in 2016 (and, to a lesser extent, in 2020).

Coalitional realignments are wellsprings of political change, for good or ill. In her 2023 must-read book Doppelganger, Naomi Klein provides a clear-eyed look at how liberals sidelined leftists in the progressive coalition, and how that opened space for the right to peel off working people and get Trump elected:

https://pluralistic.net/2023/09/05/not-that-naomi/#if-the-naomi-be-klein-youre-doing-just-fine

And the right relies on racebaiting and other appeals to cruelty and hatred to get millions of low-income people to elect politicians who will make their rich bosses even richer, at their expense:

https://pluralistic.net/2022/03/09/turkeys-voting-for-christmas/#culture-wars

But coalitions aren’t always manufactured. The most powerful, transformative coalitions coalesce out of a recognition of real, shared interests. As James Boyle has written, the term “ecology” turned a thousand issues into one movement, fusing people who cared about endangered owls with people campaigning on the ozone layer, by giving them a word (“ecology”) that brought all those issues under one umbrella:

https://pluralistic.net/2020/08/29/chickenized-home-to-roost/#chickenizers-come-home-to-roost

Today, we’re seeing a growing “Privacy First” movement that recognizes that so many of America’s ills start with the country’s poor privacy protections (Congress last updated consumer privacy law in 1988, when they banned video-store clerks from telling newspapers which VHS cassettes you rented). Whether you’re worried about Big Tech brainwashing, AI deepfake porn, red state Attorneys General following teens into abortion clinics, or cops rounding up the identities of protesters with “reverse-warrants” served on Google, you’re really worried about privacy:

https://pluralistic.net/2023/12/06/privacy-first/#but-not-just-privacy

Whenever you see a coalition forming — or rupturing — you’ve found a leading indicator of change. Today, I found one of those indicators: Microsoft has announced that it is abandoning “kernel-level anti-cheat”:

https://www.notebookcheck.net/Microsoft-paves-the-way-for-Linux-gaming-success-with-plan-that-would-kill-kernel-level-anti-cheat.888345.0.html

This is esoteric stuff, but it has far-reaching implications for some of our most important tech and policy questions, from mass surveillance to monopoly, from digital catastrophe to Dieselgate.

About 20 years ago, a team of researchers at Microsoft had a weird and (mostly) terrible idea. They figured out a way to allow computers to send information about their configurations over the internet to untrusted third parties, in such a way that the owners of the sending computers couldn’t falsify that information:

https://pluralistic.net/2020/12/05/trusting-trust/#thompsons-devil

This is called a “remote attestation.” It allows a server to make sure that it’s sending data to an app on a phone, and not a computer program on a laptop that’s pretending to be an app on a phone. This has beneficial uses (the encrypted messaging app Signal uses remote attestations so that the Signal program on your phone can assure itself that it’s talking to a computer that’s running an unaltered version of the Signal server app, which makes it much harder to spy on Signal conversations):

https://signal.org/blog/private-contact-discovery/
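To make the mechanics concrete, here is a deliberately simplified sketch of one attestation round. A real trusted computing module uses asymmetric keys and hardware measurement registers; the HMAC key below just stands in for a key baked into hardware that the computer's owner can't read or change:

```python
import hashlib
import hmac
import os

# Simplified illustration only: an HMAC key stands in for the
# hardware attestation key the owner cannot read or alter.
DEVICE_KEY = b"baked-into-trusted-hardware"

def attest(software_measurement: bytes, nonce: bytes) -> bytes:
    """Device side: the trusted module signs a 'quote' of the current
    software state. The operating system can't forge this signature."""
    return hmac.new(DEVICE_KEY, software_measurement + nonce, hashlib.sha256).digest()

def verify(quote: bytes, expected_measurement: bytes, nonce: bytes) -> bool:
    """Server side: check the quote against the configuration it expects."""
    expected = hmac.new(DEVICE_KEY, expected_measurement + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(quote, expected)

# The server challenges with a fresh nonce so old quotes can't be replayed.
nonce = os.urandom(16)
genuine = hashlib.sha256(b"unmodified-app-v1.0").digest()
tampered = hashlib.sha256(b"modified-app-v1.0").digest()

assert verify(attest(genuine, nonce), genuine, nonce)       # accepted
assert not verify(attest(tampered, nonce), genuine, nonce)  # rejected
```

The key point is who holds the signing key: because it lives in hardware the owner can't touch, the quote is trustworthy to the server even when the owner would rather it weren't.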

But it’s also got a lot of harmful and anticompetitive uses. For example, remote attestation can be used to ensure that you’re not running an ad-blocker or other privacy-preserving technology, forcing you to choose between accessing a service and protecting your privacy:

https://pluralistic.net/2023/08/02/self-incrimination/#wei-bai-bai

Remote attestation — and “trusted computing,” the underlying subsystem that makes it possible — gives manufacturers the power to order your device to lie to you, and also prevents your device from lying to them on your behalf (say, by promising them that you haven’t disabled the trusted computing module even after you have). This is key to everything from HP forcing you to pay $10,000/gallon for ink to Tesla lying to you about your car’s range:

https://pluralistic.net/2023/07/28/edison-not-tesla/#demon-haunted-world

And while there are plenty of potentially good applications for a module in your computer that you — the computer’s owner — can’t examine or override, these are vastly outweighed by the risks associated with such a module. For one thing, if someone can figure out how to install malicious software in that module, then — by definition — you can’t know what it’s doing, or prevent it from doing it:

https://pluralistic.net/2022/07/28/descartes-was-an-optimist/#uh-oh

And boy, do malware authors loooove to infect your computer’s trusted computing module with undetectable, unstoppable viruses:

https://pluralistic.net/2024/01/18/descartes-delenda-est/#self-destruct-sequence-initiated

All of these risks and abuses were foreseeable in the early 2000s, when Microsoft was teaming up with the semiconductor industry to roll out trusted computing. To get trusted computing adopted in spite of the problems it would unarguably create, Microsoft needed a coalition of the willing. That’s where gamers came in.

As a class, gamers hate digital rights management (DRM), the anti-copying, anti-sharing code that stops gamers from playing older games, selling or giving away games, or just playing games:

https://www.reddit.com/r/truegaming/comments/1x7qhs/why_do_you_hate_drm/

Trusted computing promised to supercharge DRM and make it orders of magnitude harder to break — a promise it delivered on. That made gamers a weird partner for the pro-trusted computing coalition.

But coalitions are weird, and coalitions that bring together diverging (and opposing) constituencies are very powerful (if fractious), because one member can speak to lawmakers, companies, nonprofits and groups that would normally have nothing to do with another member.

Gamers may hate DRM, but they hate cheating even more. As a class, gamers have an all-consuming hatred of cheats that overrides all other considerations (which is weird, because the cheats are used by gamers!). One thing trusted computing is pretty good at is detecting cheating. Gamers — or, more often, game servers — can use remote attestation to force each player’s computer to cough up a true account of its configuration, including whether there are any cheats running on the computer that would give the player an edge. By design, owners of computers can’t override trusted computing modules, which means that even if you want to cheat, your computer will still rat you out.
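As a sketch, with invented names (real anti-cheat inspects far more than a list of loaded modules), the server-side logic amounts to this:

```python
# Hypothetical sketch: the module list arrives in a hardware-signed
# attestation, so the player can't edit it before it reaches the server.
CHEAT_SIGNATURES = {"aimbot.dll", "wallhack.sys"}  # invented names

def admit_player(attested_modules: set[str]) -> bool:
    """Refuse any player whose attested configuration includes a known cheat."""
    return not (attested_modules & CHEAT_SIGNATURES)

assert admit_player({"game.exe", "graphics.dll"})   # clean machine admitted
assert not admit_player({"game.exe", "aimbot.dll"}) # cheating machine refused
```

The check itself is trivial; all the power comes from the attestation underneath it, which the player's own computer is built to be unable to falsify.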

As a class, gamers can be relied upon to endorse measures that make cheating harder, even when that wrecks other things they care about. That’s something we learned the hard way in 2002, when Blizzard — the giant games company that’s now part of (who else?) Microsoft — sued some hobbyists who’d made a software program called “bnetd”:

https://www.eff.org/cases/blizzard-v-bnetd

Bnetd was a standalone replacement for Blizzard’s Battlenet servers. Battlenet servers connected up gamers for multiplayer games. Among other things, Battlenet servers allowed Blizzard to check the license key for every player (rejecting players who used keys that were circulating on pirate boards), which is something Blizzard really cared about. Battlenet also enforced anti-cheating measures, something that players (even players who didn’t care whether their opponents were using valid license keys) valued highly.

The bnetd authors were motivated by another factor altogether: Battlenet sucked. The servers were slow, laggy, and crashed. Bnetd allowed players to stand up their own servers, and to imbue those servers with “house rules” about how the gameplay would run. After all, the difference between “house rules” and “cheating” is whether all the players agree. Sometimes, those house rules turn into great games, like Portal, which started out as a user’s mod to Half-Life:

https://half-life.fandom.com/wiki/Development_History_of_Portal

Blizzard went after bnetd for two reasons: first, to make life worse for players with invalid license keys; and second, because bnetd was making them look incompetent. I wasn’t surprised that Blizzard attacked bnetd — but what did surprise me was the sheer number of gamers who leapt to Blizzard’s defense. Theoretically, these players were class allies of the bnetd authors, whose project was to give them more reliable and flexible servers to play on. But once Blizzard successfully framed the bnetd lawsuit as a way to stamp out cheating, large numbers of gamers switched sides and fought against their own interests.

Time and again, “we’re stamping out cheating” has proven itself to be a reliable way to recruit gamers to fight against their own interests. A couple of years after bnetd, gamers once again leapt to Blizzard’s defense in its lawsuit over an automation program called “Glider”:

https://arstechnica.com/gaming/2009/01/judges-ruling-that-wow-bot-violates-dmca-is-troubling/

Though Glider had many uses, many World of Warcraft players saw it as a cheating tool, and that perception trumped every other consideration, including the fact that Blizzard was advancing a legal theory with far-reaching implications for games and for every other kind of digital service and device. Blizzard’s Glider theory would make it a crime to refill your printer cartridges or use a third-party garage-door opener, but none of that mattered once Blizzard cloaked itself in anti-cheating.

Since then, gamers have maintained a split personality when it comes to DRM. They hate DRM for their games, but they love trusted computing for its anti-cheat. When games companies go to the US Copyright Office to oppose efforts to preserve old games, they use anti-cheat as their pretext, citing gamers’ hatred of cheating in support of their position:

https://www.theverge.com/2022/3/21/22988902/nintendo-wiiu-3ds-eshop-closure-dmca-section-1201

Which brings me back to Microsoft’s announcement about “kernel-level anti-cheat.” “Kernel-level anti-cheat” is exactly the kind of thing that Microsoft promised when it started building the coalition for trusted computing and remote attestation. The “kernel” is an extremely powerful, low-level part of your computer’s operating system, and programs that patch the kernel can operate without fear of interference from other programs. When malware successfully infiltrates a kernel (as with a “rootkit”), it can hide itself perfectly from antivirus software:

https://en.wikipedia.org/wiki/Sony_BMG_copy_protection_rootkit_scandal

One of the primary — and often beneficial — goals of trusted computing is to detect and prevent kernel modifications. But there are classes of programs that Microsoft grants an exception to when it comes to kernel access. Most notoriously, Microsoft allowed Crowdstrike’s “endpoint protection” software privileged access to the kernel, affording it the ability to detect and interdict malicious programs. This worked great, but boy did it fail badly: Crowdstrike’s privileged access to the kernel meant that a minor programming error (and major defects in Crowdstrike’s testing and deployment procedure) paralyzed the world and did hundreds of billions of dollars’ worth of damage:

https://pluralistic.net/2024/07/22/degoogled/#kafka-as-a-service

The Crowdstrike disaster evidently prompted a great deal of soul-searching within Microsoft as it reconsidered the wisdom of whitelisting other companies’ third-party code to run in its kernel. And if even preventing malicious software doesn’t justify granting exceptional access to the kernel, why should anti-cheat for games get it?

Hence this turn: a decision to take anti-cheat out of the kernel. This is going to have lots of knock-on effects. For one thing, it’s going to make it infinitely easier to support multiplayer games on GNU/Linux computers (which gamers are far more likely to own or be open to trying), as well as handhelds and consoles running on free/open source software.

But most important from my perspective is that Microsoft is excising gamers from the trusted computing coalition. Gamers were always an unlikely fit, given their general antipathy to DRM and their general interest in free/open source software, software preservation, modding, and all the other activities that DRM — and remote attestation — interferes with. The fact that so many gamers could be lured into the pro-trusted computing coalition with promises of better anti-cheat was key to the spread of trusted computing into so many of our devices. That fact is changing.

Those gamers are now in play for the anti-trusted-computing coalition.

That’s a coalition that needs all the help it can get, because Trusted Computing is a key part of the enshittification strategies of every shitty company in the world. Take Apple, whose CEO has openly admitted that if his customers can fix their phones (rather than throwing them away), his company’s profitability will suffer:

https://www.apple.com/newsroom/2019/01/letter-from-tim-cook-to-apple-investors/

No wonder Apple leads the coalition against Right to Repair. Apple makes liberal use of trusted computing to block repair: through “parts pairing,” Apple can prevent your phone from working with parts harvested from dead phones. Parts pairing uses trusted computing to allow a device (like an iPhone) to detect when a real, non-counterfeit part has been installed by a third-party technician, and refuse to access that part until an authorized tech types in an unlock code.

Parts pairing started in the automotive industry (where it was called “VIN locking”) and spread to tractors, phones, and many other kinds of devices:

https://pluralistic.net/2022/05/08/about-those-kill-switched-ukrainian-tractors/

Parts pairing is such an effective way to block repair that it’s turned into an epidemic — so much so that it’s been banned in Oregon:

https://9to5mac.com/2024/03/28/parts-pairing-oregon/

Apple has learned precisely the wrong lesson from this ban. Rather than seeing it as a signal to abandon parts pairing, Apple has just announced that it will do even more parts pairing:

https://mashable.com/article/apple-tamp-down-stolen-iphone-parts-new-ios18-feature

Apple claims it’s doing this to prevent “stolen” iPhone parts from entering the parts stream — not to protect itself from customers who want to fix their phones rather than sending them to a landfill. But if that’s true, then there’s a much simpler way to kill the market for stolen parts: sell official parts to third-party technicians.

With Microsoft kicking gamers out of the trusted computing coalition, life will get harder for Apple and everyone else who relies on trusted computing to screw their customers, block competition, and fill the land and sea with immortal, toxic e-waste. Coalitional reconfigurations may seem obscure and esoteric, but if you care about how the world works, they’re the key to making change.

If you’d like an essay-formatted version of this post to read or share, here’s a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:

https://pluralistic.net/2024/09/16/gamer-gate/#descartes-revenge
