More Everything Forever

Science fiction is an allegory, not a prediction.

Cory Doctorow
4 min read · Apr 22, 2025
The Basic Books cover for Adam Becker’s ‘More Everything Forever.’

I’m on a 20+ city book tour for my new novel Picks and Shovels. Catch me at NEW ZEALAND’S UNITY BOOKS in AUCKLAND (May 2, 6PM) and WELLINGTON (May 3, 3PM). More tour stops (Pittsburgh, PDX, London, Manchester) here.

Astrophysicist Adam Becker knows a few things about science and technology — enough to show, in a new book called More Everything Forever, that the claims tech bros make about near-future space colonies, brain uploading, and other skiffy subjects are all nonsense dressed up as prediction:

https://www.hachettebookgroup.com/titles/adam-becker/more-everything-forever/9781541619593/

Becker investigates the personalities, the ideologies, the coalitions, the histories, and crucially, the grifts behind such science fictional pursuits as infinite life-extension, space colonization, automation panic, AI doomerism, longtermism, effective altruism, rationalism, and consciousness uploading.

This is, loosely speaking, the bundle of ideologies that Timnit Gebru and Émile P. Torres dubbed TESCREAL (Transhumanism, Extropianism, Singularitarianism, (modern) Cosmism, Rationalism, Effective Altruism, and Longtermism):

https://en.wikipedia.org/wiki/TESCREAL

While these are largely associated with modern Silicon Valley esoteric tech bros (and the odd Oxfordian like Nick Bostrom), they have very deep roots, which Becker excavates — like Nikolai Fyodorov’s 19th century “cosmism,” a project to “scientifically” resurrect everyone who ever lived inside of a simulation:

https://en.wikipedia.org/wiki/Nikolai_Fyodorov_(philosopher)

In their modern incarnation, these ideas largely originate in science fiction novels. That is to say, they were made up and popularized by people like me, the vast majority of whom made no pretense of being able to predict the future or even realistically describe a path from the present to the future they were presenting. Science fiction is something between a card trick and a consensual con game, where the writer shows you just enough detail to make you think that the rest of it must be lurking somewhere in the wings. No one in sf has ever explained how consciousness uploading could possibly work, and neither have any of the advocates for consciousness uploading — the difference is that (most of) the sf writers know they’re just making stuff up.

Becker’s central question is how so many “smart” people (some of them very smart and accomplished, others merely very certain that they are smart despite all evidence to the contrary) can mistake futuristic allegories made up by pulp writers for prophecy.

In answering this question, he uncovers a corollary of Upton Sinclair’s famous maxim that “it is difficult to get a man to understand something, when his salary depends on his not understanding it,” namely, that “it is easy to get a person to believe something when doing so will make them feel good about themselves.”

The beliefs that Becker explores in this book sometimes make the believers rich (like the AI grifters who run around shouting about AI taking over the world and turning us all into paperclips). Sometimes, they make their believers feel good about being selfish assholes (like longtermism, which holds that all the misery in the world today is worth it if you can make 24 heptillion hypothetical simulated people just a little happy in 10,000 years). Sometimes, they make their believers feel good about life after death, or eternal life — the same pitch that religions have been roping in followers with since the stone age.

What differentiates these beliefs from other faith-based claims is that their followers claim that they aren’t operating on faith, but on science, reason and rationality. This is where the fact that Becker is a bona fide astrophysicist comes in. Not only is he personally qualified to debunk claims about space colonization, but he’s also familiar with the rigorous process of scientific inquiry, and capable of consulting experts and listening to them. That’s how he concludes, for example, that having your head cut off and frozen when you die is just a form of corpse mutilation, with a zero point zero zero zero zero percent chance of someone recovering your mind from your freezerburned brain.

Like his subjects, Becker has a complicated relationship with science fiction. He, too, enjoys the imaginative flights of the genre, its delightful thought-experiments, its gnarly moral conundra. I love these too. They make for a fascinating and often useful lens for understanding and challenging our own relationship with technology and our very humanity. Ultimately, Becker is exploring the difference between reading sf because it makes you think in new ways, and reading sf as a kind of prophetic text, and — crucially — he’s asserting that it’s perfectly possible to enjoy this stuff without organizing your moral life around hypothetical heptillions of virtual people living in the year 25,000; or, indeed, having your head cut off and frozen.

If you’d like an essay-formatted version of this post to read or share, here’s a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:

https://pluralistic.net/2025/04/22/vinges-bastards/#cyberpunk-is-a-warning-not-a-suggestion

Written by Cory Doctorow

Writer, blogger, activist. Blog: https://pluralistic.net; Mailing list: https://pluralistic.net/plura-list; Mastodon: @pluralistic@mamot.fr