Max More's take on the Singularity
Bruce Klein 28 Dec 2002
Max More, Tom W. Bell, and Simon Levy:
founding Directors of Extropy Institute
July 2002
Interview with Max More
by Sander Olson
What is your opinion of the technological Singularity?
I've studied the concept for quite a while, and I'm somewhat of a skeptic of the idea. I do believe that there will be a Singularity in a sense -- in my recent debate with Ray Kurzweil, I refer to the concept of a surge, rather than a single point, 2035 or whatever. Singularity enthusiasts see some incredible Singularity event, all at once, which changes everything radically. I think that is probably unlikely. I think, for instance, that simple projections of computer power are misleading; you also have to take into account social factors and economic factors. There is often a lag between the invention of a technology and the effect that it has on people.
This is true with every major technology, and the lag can generally be measured in decades. Digital computers have been around for decades, but it was only in the last few years that we saw an effect on productivity. It's only now that we organize work to take advantage of the new technology. So Hans Moravec argues that we'll have human-level intelligence in 2030 because we should have the hardware then; I just don't think that is very plausible, looking at the historical situation. I see more of a series of surges -- biotech surges, machine intelligence surges, nanotech surges -- and I don't necessarily see all of the surges occurring at the same time. These surges will be difficult to deal with, but I don't think there will be one day when everything suddenly and radically changes.
One reason why I'm skeptical of the concept of a Singularity is that to me it rings of Christian doctrine; the rapture of the future if you like. I think that there is a strong psychological/cultural pull for that outlook, which is another reason to be suspicious. Those concerns aside, the Singularity notion is a handy way to express a whole range of technological trends that cannot realistically be expected to be linear in nature.
Article
Lazarus Long 28 Dec 2002
One reason why I'm skeptical of the concept of a Singularity is that to me it rings of Christian doctrine; the rapture of the future if you like. I think that there is a strong psychological/cultural pull for that outlook, which is another reason to be suspicious.
I thought I was reading my own words.
I see more of a series of surges -- biotech surges, machine intelligence surges, nanotech surges -- and I don't necessarily see all of the surges occurring at the same time. These surges will be difficult to deal with, but I don't think there will be one day when everything suddenly and radically changes.
Any good student of history notices quickly that it is full of various "Chicken Littles" screaming "The End is Nigh" or "The Lord is Coming" before a comet or a millennium (remember Y2K?), and these dates come and go with little to actually mark their passage. Rarely does the future happen according to a schedule.
I like John Lennon's take on this: "Life is what happens while you are making other plans." Don't expect the expected when we talk about the "Singularity"; anticipate the unexpected.
When we talk about the event as something beyond the Earth and what I call the GAIA conscience, then we have Einstein standing before us like the Colossus of Rhodes, both bearing a light to guide us and blocking the path into a channel.
thefirstimmortal 30 Dec 2002
Once the Singularity is reached, Vinge believes our old models will have to be discarded because of the emergence of a new reality. The pace of technological change will become so rapid and so profound that it will rupture the basic fabric of human history.
Raymond Kurzweil, who embraces Vinge’s Singularity thesis, adds that as the exponential growth of technology continues to accelerate into the first half of the twenty-first century, it will appear to explode into infinity, at least from the limited, linear perspective of contemporary humans. The progress of technical change will ultimately become so fast that it will leave behind our ability to follow it.
The Singularity will transform every aspect of human life: social, sexual, and economic. In Kurzweil’s view, the emergence early in the twenty-first century of a new form of intelligence on Earth that can compete with, and ultimately significantly exceed, human intelligence will be a development of greater import than any of the events that have shaped human history.
Mind 30 Dec 2002
DJS 16 Jan 2003
One last thought, and bear with me on this one. It is a well-known fact that Christmas was established as December 25 because it fell within the Roman winter festival. By having Christmas coincide with the festival, it made it easier for Romans to make the transition from polytheism to Christianity. Could the concept of a Singularity make the transition from Christianity to transhumanism more palatable? Or should we be purists and let the logic of our argument be the deciding factor? I guess it all depends on whether you believe the ends justify the means. I am always looking for "conversion angles". [wacko]
MichaelAnissimov 16 Jan 2003
If any of you IT guys want to argue with me be my guest.
You asked for it! [ggg] Thank you kindly for the invitation. But really, I haven't written more than a few thousand lines of programming code since I began using a computer, and I've only attended one IT-related class in my life. What's up with the stereotype?
Singularity=techno-rapture.
Only in the same sense that George Bush = Christian Devil. (Some folk may make the comparison, but the analogy doesn't have very much predictive value.)
It seems to be too locked into the historical template of grand design.
Interesting how the laws of physics don't care too much about this. Have you generally considered the physics and systems theory of a brain, and thought about what might happen if that brain were accelerated a trillionfold or a quadrillionfold, and given full self-access, and a catalog of millions of other self-enhancement tricks that humans don't have?
Call me old school but I like my body thank you very much! If any of you IT guys want to argue with me be my guest. The only practical value I can see is the possible augmenting of my intelligence.
I don't see how these beliefs of yours affect the likelihood of the Singularity actually happening one way or another. Who told you the Singularity necessarily means relinquishing your biological body? The only danger I see for your biological body would be if an unFriendly Singularity occurred, wiping out all of humanity. That's why we Singularitarians care about safeguarding the *integrity* of the Singularity, rather than focusing on other issues and pretending that AI won't bother us simply because we don't care enough about it at the moment.
My main concern is the integrity of myself as a biological entity. Ah, I am revealing that I have a slightly conservative nature.
Then that means you probably don't want any malevolent or human-indifferent superintelligences running around. Which means actively directing the creation of the first greater-than-human intelligence towards benevolence, right? Conservative, radical, whatever; we're one humanity, all in the same boat.
I would probably let someone else "jump" into the singularity first and then tell me if it was alright to come on in.
Will do.
Personally, I am more interested in using uploading as a backup to my biological entity. AKA, if I get run over by a truck a real time copy of me will be ready to go.
Yes - but in actuality, uploading doesn't have a huge amount to do with the Singularity. We're already partially uploaded right now by sending our thoughts as data streams over the Internet, and using a software interface which "doesn't really exist" to navigate amongst the forum of knowledge, which also doesn't really exist physically. If this More article gave you the idea that Singularitarians are necessarily uploaders, or something along those lines, then I'm afraid to say that the More article is based on an outdated or superficial understanding of what the Singularity effort really is.
Is there the possibility of a singularity? Sure, but this conjecture on future hypotheticals feels like mental masturbation to me. I do believe that a singularity is necessary to ensure true immortality because of the potential for accidents in the physical world.
Whether or not a "Singularity" happens in the sense presented in this article, the creation of superintelligence with an arbitrarily large capacity to do good or harm towards humans is inevitable within the next 40 years (more like 20 or 10). In what way is preparing for this "mental masturbation"? We're actually doing stuff, not simply basking in the spirit-lifting glow of the idea.
One last thought, and bear with me on this one. It is a well-known fact that Christmas was established as December 25 because it fell within the Roman winter festival. By having Christmas coincide with the festival, it made it easier for Romans to make the transition from polytheism to Christianity. Could the concept of a Singularity make the transition from Christianity to transhumanism more palatable? Or should we be purists and let the logic of our argument be the deciding factor? I guess it all depends on whether you believe the ends justify the means. I am always looking for "conversion angles".
I would consider this astoundingly unethical and self-defeating.
Kissinger, welcome to the forums, have you visited http://singinst.org yet?
DJS 17 Jan 2003
Self-defeating? Don't be so idealistic. I am always looking for angles. You forget, the majority of civilized society is not as intelligent as us. They need a simple message that can give them hope. History proves that is how Revolution works; and if you think that the changes we are proposing will take place with anything less than a Revolution, you are mistaken.
Thomas 19 Jan 2003
Yes, a lot of very intelligent folk, like Kurzweil or More, seem not to see this sudden discontinuity, even after a subhuman GAI level. However, once we have human-level intelligence in a machine, the "cat is definitely out of the bag". At that point things are going to take off.
I am puzzled why this is so.
If they claimed that human-level AI would arrive around the year 2100, I would understand.
But they don't. They allow that it may happen even before 2030, but still they insist on a decades-long transition.
Kurzweil, for example, sees that we are going to expand outward at the maximum possible speed, most probably the speed of light.
But still ... he predicts some initial delay between the SAI emergence and the Singularity.
Maybe, Ray Kurzweil and Max More don't want to scare people too much. [B)]
On the other hand, pushing the Singularity into the next few years is not very realistic. I wouldn't be surprised if it happens, but most likely it will not.
I think that by 2010 we will have quite a good idea of WHEN.