[Excerpted from Transcendence: The Disinformation Encyclopedia of Transhumanism and the Singularity by R.U. Sirius and Jay Cornell]
Empowering individuals and transcending what were long considered human limits: these goals are exciting to some, but they’re disturbing and frightening to others. Let’s put major objections to transhumanism into one or more of four categories: that it’s unfeasible, directly dangerous, indirectly dangerous, or immoral.
“IT WON’T WORK!”
It’s easy to be skeptical about technological predictions. After all, nobody commutes by personal helicopter or nuclear-powered automobile, or has a kitchen robot cooking dinner. Nuclear power never made electricity “too cheap to meter,” and controlled fusion has been twenty years away for about sixty years now . . . and still is.
Some say that mind uploading is impossible, pointing to the assumptions implicit in the concept: the idea that “you” (your self, mind, or soul) is something distinct from your body. This philosophical dualism is seen as inconsistent with the materialism transhumanists otherwise profess.
But let’s assume it’s perfected. You’re on your deathbed with a terminal illness, your mind is copied to a computer, and your body dies. Are “you” now in the computer? Or are you dead, and what’s in the computer is just a copy of “you”? That’s nice for the copy that’s happy to be there (one hopes), but it’s not much consolation for the old you that was in your body. So mind uploading can be seen as an ontological fail.
Core aspects of The Singularity, and Kurzweilian techno-optimism in general, have many critics. Paul Allen points out that as we gain deeper knowledge of natural systems like human intelligence, our theories become increasingly complex. Moore’s Law notwithstanding, the Law of Accelerating Returns thus runs up against this “complexity brake.” Since the concept of artificial intelligence is central to The Singularity, the complexity brake may delay it, if not make it forever unobtainable.
“IT’S DANGEROUS AND COULD KILL US ALL!”
All tools can be used for good or evil, and transhumanist technology has a number of what might be called direct dangers. 3D printers can print unregistered guns for the wrong people. Nanotech molecular assemblers could become tabletop drug labs. Mind enhancement could be used by criminals to enhance their criminal abilities. Biotechnology could be used to create terrorist bioweapons.
Some fear that GMOs could disrupt the world’s food supply. Self-replicating nanobots could get out of control and consume everything on Earth: the “grey goo” scenario. A powerful, post-Singularity artificial intelligence could decide to exterminate humanity—the Skynet scenario from the Terminator movies.
“IT’S SOCIALLY DESTRUCTIVE!”
The indirect dangers of transhumanism are at least as numerous. Sometimes the fears are about society or the state being damaged by out-of-control technology in the hands of individuals. At other times, the fears are that society or the state will gain too much power over individuals.
Although the old-fashioned coercive eugenics of the early 20th century progressive and Nazi varieties are long gone, their ghosts still haunt any discussion of transhumanism. Critics warn of socially divisive effects, such as a “genetic divide” of classes based on genetic modifications, as portrayed in the dystopian movie Gattaca or the novel Brave New World. Francis Fukuyama has called transhumanism “the world’s most dangerous idea,” seeing even voluntary genetic improvements as an anti-egalitarian threat to the ideals of liberal democracy.
After all, the benefits of transhumanism aren’t going to happen everywhere and to everyone all at once. Some people will be the first to benefit from brain enhancements and life extension, and those people are likely to be rich and have the right social or political connections. Social cohesiveness could suffer. Imagine mentally enhanced, immortal rich people and politicians, whose enhanced offspring always get the best grades, the best jobs, and win every Oscar, Emmy, and MVP award. Nobody wants a transhuman overclass, except the people who imagine themselves in it.
Or, perhaps even worse, the blurring of the meaning of “human” could lead to the creation of an underclass of semi-humans. Think The Island of Doctor Moreau, Blade Runner replicants, or Cordwainer Smith’s Underpeople.
“IT’S DEHUMANIZING AND PLAYING GOD!”
More philosophical objections come from critics who agree on little else, but see transhumanism as essentially immoral and anti-human.
The traditionalist, conservative attack can be summed up in a sentence from a 2002 Vatican statement: “Changing the genetic identity of man as a human person through the production of an infrahuman being is radically immoral.” For many Christians (and members of other Abrahamic faiths), transcending human limits is a dangerously hubristic idea, because it gives humans powers that should be reserved for God. Transhumanism is seen as a form of “scientism”—a dogma that empirical science is the most authoritative worldview, and that only measurable knowledge is valuable. Thus, transhumanism is a type of idolatry and more “false good news,” a utopian movement trying to immanentize the eschaton (create heaven on earth). Conquering death and creating utopia cannot happen through technological means, they say, but only through God. Furthermore, God doesn’t appreciate the attempts at competition.
There are related attacks from the secular left: transhumanism is seen as “atomized individualism,” power fantasies, and as an expression of contempt for the flesh. Feminists may see extensions of unhealthy cultural obsessions with youth and beauty. Some consider biological enhancements as trivializing human identity, or see an “ableist bias” in even thinking in terms of “improvements” or overcoming mental or physical “limitations.”
POINTS TAKEN, BUT . . .
For some, possible dangers will always trump possible benefits, and alleviating existing poverty or inequality will always take precedence over an advancement that will benefit only the relatively wealthy (at least at first).
Of course, taken to their logical extreme, those views would stop most technological advancements, because they all have some dangers and negative side effects. The rich or the lucky few always benefit first, so some types of inequality will temporarily increase.
Besides, being first isn’t always best. Think of those cutting-edge LED watches that cost hundreds or thousands of dollars in the early ’70s. That money helped finance the cheaper and more advanced digital technology that came later, and all those rich people got was a couple of years of looking fashionable, and they had to press a button to see the time.
As we saw with thalidomide and metal-on-metal hip implants, being an early adopter can be risky. So perhaps we should not worry about the rich financing transhumanist biotech. They’ll be the first guinea pigs, and you can get it when it’s cheaper, better, and safer.
If humanity had always viewed technology through the lenses of the precautionary principle and equal access, we’d still be arguing whether to allow this invention called “fire.”