  LongeCity
              Advocacy & Research for Unlimited Lifespans





Max More's take on the Singularity


7 replies to this topic

#1 Bruce Klein

  • Guardian Founder
  • 8,794 posts
  • 242
  • Location:United States

Posted 28 December 2002 - 04:04 PM


The following may be of interest to Singularitarians. While Max expresses no concrete objection to a future event that resembles the Singularity, he does take issue with the speed at which it may happen and with the religious parallels.

Max More, Tom W. Bell, and Simon Levy:
founding Directors of Extropy Institute


July 2002
Interview with Max More
by Sander Olson

What is your opinion of the technological Singularity?

I've studied the concept for quite a while, and I'm somewhat a skeptic of the idea. I do believe that there will be a Singularity in a sense-- in my recent debate with Ray Kurzweil, I refer to the concept of a surge, rather than a single point, 2035 or whatever. Singularity enthusiasts see some incredible Singularity event, all at once, which changes everything radically. I think that is probably unlikely. I think, for instance, that simple projections of computer power are misleading; you also have to take into account social factors and economic factors. There is often a lag between the invention of a technology, and the effect that it has on people.

This is true with every major technology, and the lag can generally be measured in decades. Digital computers have been around for decades, but it was only in the last few years that we saw an effect on productivity. It's only now that we organize work to take advantage of the new technology. So Hans Moravec argues that we'll have human-level intelligence in 2030 because we should have the hardware then; I just don't think that is very plausible, looking at the historical situation. I see more of a series of surges -- biotech surges, machine intelligence surges, nanotech surges -- and I don't necessarily see all of the surges occurring at the same time. These surges will be difficult to deal with, but I don't think there will be one day when everything suddenly and radically changes.

One reason why I'm skeptical of the concept of a Singularity is that to me it rings of Christian doctrine; the rapture of the future if you like. I think that there is a strong psychological/cultural pull for that outlook, which is another reason to be suspicious. Those concerns aside, the Singularity notion is a handy way to express a whole range of technological trends that cannot realistically be expected to be linear in nature.

Article

#2 Lazarus Long

  • Life Member, Guardian
  • 8,116 posts
  • 242
  • Location:Northern, Western Hemisphere of Earth, Usually of late, New York

Posted 28 December 2002 - 05:24 PM

One reason why I'm skeptical of the concept of a Singularity is that to me it rings of Christian doctrine; the rapture of the future if you like. I think that there is a strong psychological/cultural pull for that outlook, which is another reason to be suspicious.


I thought I was reading my own words.

I see more of a series of surges -- biotech surges, machine intelligence surges, nanotech surges -- and I don't necessarily see all of the surges occurring at the same time. These surges will be difficult to deal with, but I don't think there will be one day when everything suddenly and radically changes.


Any good student of history notices quickly that it is full of various "Chicken Littles" screaming "The End is Nigh" or "The Lord is Coming" before a comet or a millennium (remember Y2K?), and these dates come and go with little to actually mark their passage. Rarely does the future happen according to a schedule.

I like John Lennon's take on this, "Life is what happens while you are making other plans". Don't expect the expected when we talk about the "Singularity", anticipate the unexpected.

When we talk about the event as something beyond the Earth and what I call the GAIA conscience, then we have Einstein standing before us like the Colossus of Rhodes, both bearing a light to guide us and blocking the path into a channel.


#3 thefirstimmortal

  • Life Member The First Immortal
  • 6,912 posts
  • 31

Posted 30 December 2002 - 03:40 AM

The consequences of the Singularity are likely to be intense. Technical progress will be much faster than at any time in the history of civilization. Vinge sees no reason why progress itself will not involve the creation of still more intelligent entities, on a still shorter time scale. Developments that were thought possible in “a million years,” says Vinge, will likely happen in the twenty-first century.

Once the Singularity is reached, Vinge believes our old models will have to be discarded because of the emergence of a new reality. The pace of technological change will become so rapid and so profound that it will rupture the basic fabric of human history.

Raymond Kurzweil, who embraces Vinge’s Singularity thesis, adds that as the exponential growth of technology continues to accelerate into the first half of the twenty-first century, it will appear to explode into infinity, at least from the limited, linear perspective of contemporary humans. The progress of technical change will ultimately become so fast that it will leave behind our ability to follow it.

The Singularity will transform every aspect of human life: social, sexual, and economic. In Kurzweil’s view, the emergence early in the twenty-first century of a new form of intelligence on Earth that can compete with, and ultimately significantly exceed, human intelligence will be a development of greater import than any of the events that have shaped human history.

#4 Mind

  • Life Member, Director, Moderator, Treasurer
  • 19,050 posts
  • 2,000
  • Location:Wausau, WI

Posted 30 December 2002 - 05:23 AM

I can see Max More's viewpoint of surges and societal lags for a little while...maybe a couple of decades. However, once we have human-level intelligence in a machine (or successfully interface our brains with computer hardware), the "cat is definitely out of the bag". At that point things are going to take off, maybe not for all of society, but at least for those on the forefront, those willing to take that leap.

#5 DJS

  • Guest
  • 5,798 posts
  • 11
  • Location:Taipei

Posted 16 January 2003 - 10:10 AM

Singularity=techno-rapture. It seems to be too locked into the historical template of grand design. Call me old school, but I like my body, thank you very much! If any of you IT guys want to argue with me, be my guest. The only practical value I can see is the possible augmenting of my intelligence. My main concern is the integrity of myself as a biological entity. Ah, I am revealing that I have a slightly conservative nature. I would probably let someone else "jump" into the singularity first and then tell me if it was alright to come on in. But hey, to each his own. Personally, I am more interested in using uploading as a backup to my biological entity. AKA, if I get run over by a truck a real time copy of me will be ready to go.

I also do not think that the singularity will be a simultaneous event. I favor a theory of overlapping S curves, each S curve representing the progression of a new technology. Of course, if each technological revolution has a different point of origin, and since the advancement of a technology is rarely linear in nature, a singularity (convergence) is highly unlikely within the context in which it is so commonly discussed. Is there the possibility of a singularity? Sure, but this conjecture on future hypotheticals feels like mental masturbation to me. I do believe that a singularity is necessary to ensure true immortality, because of the potential for accidents in the physical world.
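The overlapping S-curve picture can be sketched numerically. The following Python snippet is a toy illustration only, not anything from the original posts; the technology names, midpoint years, and growth rate are arbitrary assumptions chosen to show staggered surges rather than a single discontinuity:

```python
import math

def logistic(t, midpoint, rate=0.3, ceiling=1.0):
    """Logistic S-curve: slow start, rapid middle, saturation."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Hypothetical technologies, each with its own point of origin (midpoint year).
surges = {"biotech": 2025, "machine intelligence": 2035, "nanotech": 2045}

def combined_progress(year):
    """Total progress as a sum of independent, staggered S-curves."""
    return sum(logistic(year, mid) for mid in surges.values())

# Because the midpoints are staggered, total growth arrives in a series
# of surges instead of one simultaneous jump.
for year in (2010, 2035, 2070):
    print(year, round(combined_progress(year), 2))
```

With staggered midpoints, each curve saturates at a different time, which is exactly why a single "convergence" date is hard to pin down in this picture.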

One last thought, and bear with me on this one. It is a well known fact that Christmas was established as December 25 because it fell within the Roman winter festival. By having Christmas coincide with the festival it made it easier for Romans to make the transition from polytheism to Christianity. Could the concept of a Singularity make the transition from Christianity to trans-humanism more palatable? Or should we be purists and let the logic of our argument be the deciding factor? I guess it all depends on whether you believe the ends justify the means. I am always looking for "conversion angles". [wacko]

Edited by Kissinger, 16 January 2003 - 11:34 AM.


#6 MichaelAnissimov

  • Guest
  • 905 posts
  • 1
  • Location:San Francisco, CA

Posted 16 January 2003 - 10:50 PM

If any of you IT guys want to argue with me, be my guest.


You asked for it! [ggg] Thank you kindly for the invitation. But really, I haven't written more than a few thousand lines of programming code since I began using a computer, and I've only attended one IT-related class in my life. What's up with the stereotype?

Singularity=techno-rapture.


Only in the same sense that George Bush = Christian Devil. (Some folk may make the comparison, but the analogy doesn't have very much predictive value.)

It seems to be too locked into the historical template of grand design.


Interesting how the laws of physics don't care too much about this. Have you generally considered the physics and systems theory of a brain, and thought about what might happen if that brain were accelerated a trillionfold or a quadrillionfold, and given full self-access, and a catalog of millions of other self-enhancement tricks that humans don't have?

Call me old school, but I like my body, thank you very much! If any of you IT guys want to argue with me, be my guest. The only practical value I can see is the possible augmenting of my intelligence.


I don't see how these beliefs of yours affect the likelihood of the Singularity actually happening one way or another. Who told you the Singularity necessarily means relinquishing your biological body? The only danger I see for your biological body would be if an unFriendly Singularity occurred, wiping out all of humanity. That's why we Singularitarians care about safeguarding the *integrity* of the Singularity, rather than focusing on other issues and pretending that AI won't bother us simply because we don't care enough about it at the moment.

My main concern is the integrity of myself as a biological entity. Ah, I am revealing that I have a slightly conservative nature.


Then that means you probably don't want any malevolent or human-indifferent superintelligences running around. Which means actively directing the creation of the first greater-than-human intelligence towards benevolence, right? Conservative, radical, whatever; we're one humanity, all in the same boat.

I would probably let someone else "jump" into the singularity first and then tell me if it was alright to come on in.


Will do.

Personally, I am more interested in using uploading as a backup to my biological entity. AKA, if I get run over by a truck a real time copy of me will be ready to go.


Yes - but in actuality, uploading doesn't have a huge amount to do with the Singularity. We're already partially uploaded right now by sending our thoughts as data streams over the Internet, and using a software interface which "doesn't really exist" to navigate amongst the forum of knowledge, which also doesn't really exist physically. If this More article gave you the idea that Singularitarians are necessarily uploaders, or something along those lines, then I'm afraid to say that the More article is based on an outdated or superficial understanding of what the Singularity effort really is.

Is there the possibility of a singularity? Sure, but this conjecture on future hypotheticals feels like mental masturbation to me. I do believe that a singularity is necessary to ensure true immortality, because of the potential for accidents in the physical world.


Whether or not a "Singularity" happens in the sense presented in this article, the creation of superintelligence with an arbitrarily large capacity to do good or harm towards humans is inevitable within the next 40 years (more like 20 or 10). In what way is preparing for this "mental masturbation"? We're actually doing stuff, not simply basking in the spirit-lifting glow of the idea.

One last thought, and bear with me on this one. It is a well known fact that Christmas was established as December 25 because it fell within the Roman winter festival. By having Christmas coincide with the festival it made it easier for Romans to make the transition from polytheism to Christianity. Could the concept of a Singularity make the transition from Christianity to trans-humanism more palatable? Or should we be purists and let the logic of our argument be the deciding factor? I guess it all depends on whether you believe the ends justify the means. I am always looking for "conversion angles".


I would consider this astoundingly unethical and self-defeating. ;)

Kissinger, welcome to the forums, have you visited http://singinst.org yet?

#7 DJS

  • Guest
  • 5,798 posts
  • 11
  • Location:Taipei

Posted 17 January 2003 - 01:04 AM

Hi Michael, thanks for the response. Man, you really dissected what I said. Well, let's see what I can respond to. First, I would like to say that until I came to this site I had only very limited exposure to the concept of the singularity, and was only giving my general opinions on the subject as I perceived it. And from what I've heard from different people, it seems that there are many different beliefs about just what the singularity will be. I have heard the concept of uploading associated with the Singularity many times. I will have to check out the web site you listed to improve my knowledge on the subject. As far as ethical concerns, I have none. My motives are not pure. Nor am I a Utopian.

Self-defeating? Don't be so idealistic. I am always looking for angles. You forget, the majority of civilized society is not as intelligent as us. They need a simple message that can give them hope. History proves that is how Revolution works; and if you think that the changes we are proposing will take place with anything less than a Revolution, you are mistaken.


#8 Thomas

  • Guest
  • 129 posts
  • 0

Posted 19 January 2003 - 12:15 PM

however once we have human level intelligence in a machine ...  the "cat is definitely out of the bag". At that point things are going to take off

Yes, a lot of very intelligent folk, like Kurzweil or More, seem not to see this sudden discontinuity, even after a merely subhuman GAI level.

I am puzzled as to why that is.

If they argued that human-level AI would arrive in the year 2100, I would understand.

But they don't. They allow that it may happen even before 2030, yet they still insist on a decades-long transition.

Kurzweil, for example, sees that we are going to expand outward at the maximal possible speed. Most probably at the speed of light.

But still ... he predicts some initial delay between the SAI emergence and the Singularity.

Maybe Ray Kurzweil and Max More don't want to scare people too much. [B)]

On the other hand, pushing the Singularity into the next few years is not very realistic. I wouldn't be surprised if it happens, but most likely it will not.

I think that by 2010 we will have quite a good idea of WHEN.

;)



