Michael Anissimov: Transhumanism Has Already Won

by Socrates on May 11, 2010

I just spent 10 minutes reading this excellent article on Michael Anissimov’s Accelerating Future blog and had to re-post it in full — it is that good!

Enjoy:

Transhumanism Has Already Won

by Michael Anissimov

It’s 2010, and transhumanism has already won. Billions of people around the world would love to upgrade their bodies, extend their youth, and amplify their powers of perception, thought, and action with the assistance of safe and tested technologies. The urge to be something more, to go beyond, is the norm rather than the exception.

At their base, the world’s two largest religions — Christianity and Islam — are transhumanistic. After all, they promise transcension from death and the concerns of the flesh, and being upgraded to that archetypical transhuman — the Angel. The angel will probably be our preliminary model as we seek to expand our capacities and enjoyment of the world using technological self-modification. Then, even angels will get bored of being angels, and expand outwards in a million new directions, resulting in an explosion of species never before seen — exceeding in magnitude and variation even the Cambrian Explosion of 530 million years ago.

Humanity, as it stands today, is a seed, a bridge. We will plant flowers and trees across the universe. All we have to do is survive our embryonic stage, stay in control of our own destiny, and expand outwards in every direction at the speed of light. Ray Kurzweil makes this point in The Singularity is Near, a book that was #1 in the Science & Technology section on Amazon and on the NYT bestsellers list for a reason.

The mainstream has embraced transhumanism. A movie about using a brain-computer interface to become what is essentially a transhuman being, Avatar, is the highest-grossing box office hit of all time, pulling in $2.7 billion. This movie was made with hard-core science fiction enthusiasts in mind. About them, James Cameron said, “If I can just get ‘em in the damn theater, the film will act on them in the way it’s supposed to, in terms of taking them on an amazing journey and giving them this rich emotional experience.” A solid SL2 film, becoming the world’s #1 film of all time? It would be hard for the world to give transhumanism a firmer endorsement than that.

Everything is Not Alright

I am tremendously sympathetic to transhumanism’s critics and detractors, more so than most transhumanists I have met. Many transhumanists don’t seem to understand that when you step outside of the confines of 3.5 billion years of natural evolution in a few short decades of intense technological progress, there are risks. Risks like new viruses and new weapons, to say the least. Even today, global security is entirely dependent on a few very knowledgeable scientists keeping their mouths shut. They start talking to the wrong people, and suddenly cities aren’t such safe places to live anymore.

People are basically nice when they’re well-fed, and damn evil when they’re hungry. Deprive the world’s cities of the millions of truckloads of fresh vegetables and meat that arrive every day, and suddenly things will get all nasty. The technologies that transhumanists talk about messing with — biotechnology, nanotechnology, artificial intelligence — will force societies to radically restructure or die. I’ve talked to dozens of selfish transhumanists whose response to this is basically, “well, too bad for them!” No. Too bad for you, because they’ll gladly drag you down with them.

Even technologies readily available today, but rarely used — such as the direct electrical stimulation of the pain and pleasure centers of the human brain — could become fearsome new plagues on humanity if they fall into the hands of political or religious fanatics. The Western world today has a sort of fantasy of invulnerability, like a teenager taking his dad’s NSX for a joyride. Americans, especially, are high and drunk on our country’s prominence in the world. What could possibly go wrong? Radical Islam hates us, and all it could manage was bringing down the World Trade Center.

Imagine a hundred or so tribes in an area covering many hundreds of square miles, quarreling with sticks and stones for hundreds if not thousands of years. Suddenly, they get rifles. In many places around the world, this has already resulted in genocide. That situation is what will happen to humanity as a whole throughout the next fifty years. The Western world is so impressed with its own accomplishments over the past centuries that we don’t realize that there is much, much more to come.

Like it or not, the bedrock of any society is security. Legal and financial power are trivialities in comparison to the military power and security that make the legal and economic machinery possible. History is strewn with thousands of examples where legal and financial “realities” collapsed like a cloud of dust when the security fundamentals were threatened. There’s nothing that will make people stay inside like a bombing or a pandemic.

They Like Us?

Mainstream culture around the world has already embraced transhumanism and transhumanist ideals. The question is not whether humanity will move towards a transhumanist future (it will), but how that power is channeled. It’s not hard to convince people to become stronger and healthier if it truly is within their grasp. What we need to worry about is massive power in the hands of individuals with selfish or truly alien and abstract morals.

Good and evil are ideas. Any goal system is ultimately arbitrary. As long as someone can protect themselves from injury or attack, they can do practically anything. This gives us unlimited freedom, but also unlimited peril. Given the ability to modify their own motivations and emotions, selfish people will have the option of becoming even more selfish. Conversely, the altruistic might amplify their compassion.

The Singularity Institute’s strategy for dealing with this challenge is the creation of a recursively self-improving moral agent — safe or “friendly” Artificial Intelligence. This ambition might turn out to be too much — it may be that programming a computer with the subtleties of human-friendly morality is too great a challenge. But, I think we should try. We should try because the first being to cross the line into true transhumanity will probably be an Artificial Intelligence, and we might as well do something to ensure that it is friendly.

Now, it may be that enhanced humans cross the line into transhumanity first. If you’re thinking about that route, consider what it means. Extensive animal testing and risky surgery. Brain implants, which would likely be necessary to achieve the kind of transhumanity that matters, are essentially carefully tuned rocks that we are inserting into proteinaceous tissue. The human skull is a pretty cramped space — there is not a lot of room for additional machinery. To really get a lot out of it, you’d need to push the brain aside or use a fluid-filled cavity, which might not play well with the body. It is highly questionable whether we can get I/O of sufficient bandwidth relying on electronic devices that just sit on the surface of the brain. My guess is that you’d need a lot of extremely thin, deep electrodes going to precise neural areas to get the necessary I/O. This may be harder or easier than it looks, but I certainly wouldn’t want to be anywhere near the first to try it.

If Artificial Intelligence and molecular nanotechnology are not available to meet humanity’s thirst for ascension, people will turn to other routes. Crude surgery and the like. That’s what I’m afraid of — a botched entrance into transhumanity. An entrance where the soldiers, fighters, gangsters, and porn stars lead the way. This is already happening now, and it isn’t all good. When magnified aggression and machismo lead the way into the future, the future becomes uncertain. We’ve seen this story many times before.

Magnify the Good in Us

To survive the future in one piece, humanity has to take those qualities that are the best in us — love, compassion, and altruism — and give them as much muscle as we can. A distributed approach will not work, because historically a few agents grab power and use it as they see fit. Even in democratic societies, this equation isn’t much different; it’s just that the few who get power are those with the most votes. Instead of denying the inevitable concentration of power, we have to do what we can to ensure that that power is used wisely.

Maybe it’s impossible to keep checks on the powerful. If so, we are still at their mercy. Nudging them to do the right thing is better than having no influence whatsoever. At some point during the next century, the most powerful being will be a transhuman. It will be sculpted by whatever process eventually succeeds in producing a true transhuman. We had better hope that process is a safe and sane one.

When people write an article about a problem, it’s usually because they have a ready-made answer they want to sell you. But sometimes the universe just gives us a problem, and it has no special obligation to give us an answer. Transhumanity is like that. Whatever answer we come up with may be a little messy, but we have to come up with something, because otherwise the future will play out according to considerations besides global security and harmony. Power asymmetry is not an optional part of the future — it is a mandatory one. There will be entities way more powerful than any human. Where will they be born? How will they be made? These questions are not entirely hypothetical — the seeds of their creation are among us now. We have to decide how we want to give birth to the next influx of intelligent species on Earth. The birth of transhumanity will mean the death of humanity, if we are not careful.

  • Socrates (http://www.SingularitySymposium.com/)

    I agree with Michael that “Transhumanism Has Already Won.” But, as he admits himself, the real question is: what does that mean for humanity? The two main options are, of course, Immortality… or Extinction…
