
The Mad Religion of Technological Salvation

22.08.2025

Image by Logan Voss.

A science journalist and PhD astrophysicist, Adam Becker spent the last several years investigating the futurological vision of our tech-bro masters – and found it’s a bunch of nonsense.

The targets in his new book More Everything Forever are the names we’ve come to associate with the great leaps forward in the march of what Becker calls the religion of technological salvation: Sam Altman, genius progenitor of artificial intelligence; Elon Musk, technocrat extraordinaire and SpaceX honcho; space-colonization guru and Amazon godhead Jeff Bezos; Marc Andreessen, billionaire software engineer, venture capital investor, and avowed technofascist; and Facebook co-founder Dustin Moskovitz, who has donated hundreds of millions of dollars to so-called “longtermist” think tanks that provide the ideological ground for the religion of tech salvation. He also aims his guns at Ray Kurzweil, grand wizard of software engineering, inventor, planner of the immortalizing digitalization of human affairs called “the Singularity;” Eliezer Yudkowsky, a writer and researcher who attributes fantastical (but as yet non-existent) powers to artificial general intelligence, or AGI; and the crew of longtermist tech apologists at Oxford University on whom Moskovitz and other Valley barons have lavished funding.

What unites these players is lust for power and control based in the seduction that technology will solve all humanity’s problems and overcome the human condition. “These ideas offer transcendence,” writes Becker. “Go to space, and you can ignore scarcity of resources…Be a longtermist, and you can ignore conventional morality, justifying whatever actions you take by claiming they’re necessary to ensure the future safety of humanity. Hasten the Singularity, and you can ignore death itself.”

Musk and Bezos’s “power fantasies” of space colonization and visions of “AI immortality” promise a future of unlimited wealth and resources, beyond the confines of Earth, the solar system, the galaxy. Ray Kurzweil’s dream of the Singularity involves the uploading of minds into digital simulations, so we can live forever. All of this, Becker says, is a divorced-from-reality sales pitch driven by the primordial fear of death. Overarching it is what’s called “engineer’s disease”: the mental derangement of believing that engineering can solve anything and everything.

In Becker’s telling, for example, Kurzweil is an unhinged fantasist manically attempting to resurrect his dead father as an artificial intelligence “Dad Bot.” Like a Christian apocalyptic prophet, the high priest of the church of tech salvation promises that the Singularity will arrive as early as 2045, when AI computing becomes so fast and so powerful it will transform society, Earth, and the universe itself to overcome “cosmological forces,” including time and aging, the laws of physics and entropy. All existence would become one giant computer spinning forever out across the vastness of space. “The objective is to tame the universe, to make it into a padded playground,” writes Becker. “Nobody would age, nobody would get sick, and – above all else – nobody’s dad would die.”

“The promise of control is total,” he explains, “especially for those who know how to control computers. This is a fantasy of a world where the single most important thing, the thing that literally determines all aspects of reality, is computer programming. All of humanity, running on a computer…”

It’s the ultimate revenge of the nerds, made worse because of our subservience to their immense money and overhyped influence. What to do in answer? Understand the authoritarian nature of these zealots, so we can repulse their attempts at the takeover of society and shatter into bits the armatures of their loony-tune machines. As Becker puts it, channeling Orwell’s 1984: “If you want a picture of [the] future, imagine a billionaire’s digital boot stamping on a human face – forever.”

I spoke with Becker recently via Zoom about his book. Our conversation has been edited for length and clarity.

Ketcham: Let’s start with what inspired you to write this book. Like, why go after Sam Altman, Ray Kurzweil, Bezos, Musk, the whole techno-optimist crowd?

Becker: I’ve been following these sorts of subcultures – longtermists, general techno-optimism, Singularity stuff – for a very long time. I’m a science fiction junkie and first encountered a lot of these ideas in science fiction in high school or earlier. I think I first heard of Ray Kurzweil in college. And I thought, oh, yeah, these ideas are bad, but they don’t seem to be getting a lot of traction. And then the funniest thing happened: tech billionaires took this stuff seriously, giving these people a lot of money. I moved out to the Bay Area about 13 years ago. And of course, this is ground zero. I realized how deep in the culture this stuff is, these things like the Singularity and AI hype, the idea that technology is going to solve every single problem, we’ll go to space and that will solve every single problem. I was amazed at how uncritical and ubiquitous the acceptance of these ideas was out here. I thought, you know, this is ridiculous. The other thing is, when I saw people going after these ideas I didn’t see a detailed scientific breakdown of why these things don’t work. There were a lot of people who dismissed people like Yudkowsky or Kurzweil just out of hand, but they would be like, Oh, this is ridiculous. Why? Usually the answer was it’s ridiculous because it’s an insane fantasy. Yes, it is an insane fantasy. Why? I thought, well, there’s not enough actual analysis because people are not taking these ideas seriously outside of these communities. What people don’t seem to realize is these communities are becoming bigger and more influential. So even though their ideas are sort of prima facie ridiculous, we have to engage with them because they are gaining more power. Fundamentally, that’s where the impulse for the book came from.

Ketcham: So what drives this zealous acceptance by the technocrats of what you describe as prima facie ridiculous ideas?

Becker: Because it provides all kinds of excuses for them to do and say the things that they already want to do and say, and that makes these ideas really appealing and compelling and persuasive for them. Because that’s the way human psychology works. If something provides an excuse for you to do a thing you want to do anyway, it increases the chances that you genuinely believe it, because it’s so convenient to believe it. It makes the world simple. It provides a sense of direction and meaning. It lets them see themselves as the hero of the story of humanity, that they’re going to save us by taking us all to space and letting us live forever with an AI god. They’re going to be the people who usher in a permanent paradise for humanity. What could be more important than that? And of course, all of that’s nonsense. But it’s like if somebody came down out of the sky and said, you are the chosen one, you are Luke fucking Skywalker, here’s your lightsaber, all you have to do is believe everything that I tell you and you will be seen as a hero. Anybody would tell you that that person was lying. You and I are used to thinking critically, but tech billionaires don’t think that way. They’re not really in the habit of thinking at all. They don’t have to, because thought is something that requires a lot of effort and critical self-examination. And if every need you could ever possibly have is taken care of, and the only thing left is this fearful pissing contest of who has the most billions of dollars, then why would you stop to question yourself? There’s no reason to, and everybody around you is going to tell you that everything you’re doing is right, because you’re a billionaire. You surround yourself with sycophants.

Ketcham: What you describe is, of course, a religion, in that it provides all the various salutary, mentally assuaging elements of religion – meaning, purpose, direction, a god of sorts.

Becker: And it even provides, in some cases, a kind of community.

Ketcham: Right. Not an unimportant thing. Let’s talk about the religion of technological salvation. The religion long predates this movement, no? You could almost go back to the Cartesian vision of the world, Enlightenment science, this idea that science and knowledge will lead to the ultimate perfection of the world. Tell me how the current iteration of the religion of tech salvation fits into the history of industrial society.

Becker: That’s a really good question. But I want to be clear. I think science is great. And I think that it is true that science has brought about really amazing things. It’s also brought about horrors. It gave us vaccines, but it also gave us thermonuclear weapons. And I think that that’s about the scale, right? Vaccines are arguably the best thing that science has ever done. And thermonuclear weapons are, I think, pretty indisputably the worst thing that science has ever enabled. But science is ultimately a tool. And just like the rest of technology, it’s a tool that is subject to human choice and contingency. What scientific truths we discover, that’s not really up to us. What technology we build off of the scientific advances that we’ve made, that is up to us. Technology is not preset on some sort of rails, like a tech tree out of a video game. And so that means that there is no inevitable future of technology. Technology can enable us to do things that we previously couldn’t, but which things it enables are a combination of the constraints placed on us by nature and human choice. The narrative that technology will inevitably lead us to a utopia or inevitably lead us to apocalypse, these are just stories that we tell. The idea that it will lead to a utopia, as you said, is an old one. The specific version of this ideology of technological salvation that the tech oligarchs and their kept intellectuals and the subcultures........

© CounterPunch