  LongeCity
              Advocacy & Research for Unlimited Lifespans





The most intelligent people are Singularitarians


46 replies to this topic

#31 Richard Leis

  • Guest
  • 866 posts
  • 0
  • Location:Tucson, Arizona

Posted 31 August 2006 - 11:20 PM

You are correct - the universe cannot be ignorant, nor can it care, nor can it do anything like an intelligent entity.

Therefore, intelligent entities cannot define the future of something like the universe. For one thing, posthuman entities will have no desire to do so. For another, the "future" has no meaning for a construct like the universe. We will define nothing.

What I am defining is our relationship to the universe, and in doing so it is difficult not to personify the universe. I apologize for doing so.

#32 Richard Leis

  • Guest
  • 866 posts
  • 0
  • Location:Tucson, Arizona

Posted 31 August 2006 - 11:43 PM

Mean nothing *to whom* exactly?


To anyone.

The great joy of the material universe, to me, is that there is no meaning to our existence. That provides us a great deal of freedom and power to do whatever we like. There are no consequences. The universe in which posthumans exist and the universe in which modern humans are destroyed are for all intents and purposes the same universe.


#33 RighteousReason

  • Topic Starter
  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 01 September 2006 - 01:46 AM

Therefore, intelligent entities cannot define the future of something like the universe.

Really? You are saying that neither you nor I can determine any part of the future whatsoever? I disagree! Haha. Come on, that statement is ludicrous.

For one thing, posthuman entities will have no desire to do so

Oh, so you have a theory about the desires of entities in the distant future that are drastically more intelligent than you are? Please share! No... don't, because again, this is absurd.

For another, the "future" has no meaning for a construct like the universe.

fu-ture  /fyoo-cher/
–noun
1. time that is to be or come hereafter.

it is difficult not to personify the universe.

I am sorry that you are plagued with such a disability.

To anyone.

If the power of moving galaxies means nothing to you, you lack a serious grasp of history and physics. I think such a power, in the hands of anyone, would be something of extreme significance and meaning, at least to me, and surely many more, if only out of the sheer sense of awe that genetically mutated apes could have made it so far without going extinct (and surely for more practical reasons, quite obviously).

The great joy of the material universe, to me, is that there is no meaning to our existence.

I am also sorry that you feel this way. I derive a great amount of personal meaning and significance in my own, yours, and every other sentient beings' existence.

There are no consequences.

Cause and effect. You are lost in your words... every event has a consequence.

The universe in which posthumans exist and the universe in which modern humans are destroyed are for all intents and purposes the same universe.

A Universe where humans transcend to posthumanity is the pinnacle of all human intents and purposes. That Universe is not this Universe, here and now.


Why will the integration of the direct computational processes of consciousness with the acceleration of computer power define the future of the Universe? The answer lies in understanding the algorithm of intelligence, as an optimization process, in relation to other optimization algorithms. Do not compare the differences between an intelligent human and an ape, or an intelligent human and a bacterium, or a rock. Compare human intelligence to the process of evolution, a known optimization process.

Evolution works by exhaustively mutating, blindly and entirely arbitrarily, every facet and state of some reproductive process. The particular distinctions between different reproductive processes are the definition of its optimization power: the processes that are less effective reproducers are killed off over time, while those that are more effective last longer. Evolution is extremely slow. It has no capacity to anticipate the long-term success and failure rates of any individual or collective mutations (to individual reproductive processes, interactions between multiple processes, etc.); it cannot plan ahead, cannot simulate or imagine the space of future possibilities, and cannot theorize, hypothesize, or prove generalities.

In the blink of an evolutionary eye, humans have had a comparatively extreme physical effect on their environment, and have exhaustively searched, categorized, imagined, and proved more of the space of possibility than blind, arbitrary evolution could obtain, under the most optimal of circumstances, anywhere near as quickly (this is an understatement of the highest order, due to my limited ability to communicate these ideas properly).
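The contrast being drawn here - blind mutation-plus-selection versus a searcher that can anticipate which moves will help - can be illustrated with a toy optimization experiment. Everything in this sketch (the one-peak fitness landscape, the step sizes, the convergence threshold) is an arbitrary assumption chosen for illustration, not a model of real evolution or real intelligence:

```python
import random

def fitness(x):
    # Arbitrary toy landscape with a single peak at x = 7.
    return -(x - 7.0) ** 2

def blind_evolution(x=0.0, steps=10000):
    """Mutate at random; keep a mutant only if it happens to be fitter.

    Like natural selection, this process cannot anticipate which
    mutations will help - it finds out only after the fact.
    """
    for step in range(steps):
        mutant = x + random.uniform(-0.5, 0.5)  # blind, arbitrary mutation
        if fitness(mutant) > fitness(x):        # selection after the fact
            x = mutant
        if abs(x - 7.0) < 0.01:
            return step  # steps taken to converge
    return steps

def foresightful_search(x=0.0, steps=10000):
    """Model the landscape and move where improvement is predicted.

    Here the 'model' is simply the slope of the toy fitness function,
    standing in for an optimizer that can plan ahead.
    """
    for step in range(steps):
        slope = -2.0 * (x - 7.0)  # predicted direction of improvement
        x += 0.25 * slope
        if abs(x - 7.0) < 0.01:
            return step
    return steps

random.seed(0)
print(blind_evolution(), foresightful_search())
```

The foresightful searcher converges in a handful of steps, while the blind one needs many times more proposals to cover the same ground - and the gap widens as the landscape gets larger, which is the shape of the argument above.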

Probably the closest metaphor for the relation of superintelligence to our current intelligence lies in the comparison of our current intelligence to evolution. Just as the power and capability of the human species as a whole has been and is being constantly optimized and accelerated by our static intelligence, so too will *the algorithm of intelligence itself* be optimized and constantly accelerated in a superintelligence. It's the difference between accelerating the content (not the code) of our consciousness, and accelerating not just the code but even the *hardware* of our consciousness.

The only other way I can think of to describe it without doing an unforgivable injustice to the nature of the idea (the following is not too original *ahem*) is as a black hole in four-dimensional spacetime. The center is an infinite singularity, accelerating so powerfully and assuredly that not even light can escape the event horizon.

The first AI is all that matters, be it our death or salvation.

#34 thegreasypole

  • Guest
  • 13 posts
  • 0

Posted 01 September 2006 - 02:05 PM

Ditto. I'm with Michael.

Nor does this mean I agree with your strawman "If you don't believe all Singularitarians are the most intelligent you MUST subscribe to the view that we're all equal in intelligence". Obviously that is not true either.

I believe, as a matter of brute fact, that you'd find Singularitarians above average in intelligence....... but that is only because the term "Singularitarian" refers to a group whose membership is likely to be selected for intelligence.

I.e. it takes a certain level of intelligence, at or above average, to understand the concepts that are necessary for a belief that the Singularity is near. Therefore anyone who describes themselves as a person who understands the Singularity MUST have this level of intelligence or above.

However, this doesn't mean we are the "most" intelligent...... the level of intelligence required to understand other concepts is much higher than that required to understand the Singularity. Take "Theoretical Physicists" or "Advanced Mathematicians". Likely those groups have a higher "intelligence threshold" for entry...... and so, on average, are more intelligent than Singularitarians.

To put it simply...... we have to display a basic knowledge of AI, Genetics, Biology and Nanotech to understand the singularity well enough to have confidence it is near. So each of us is likely to have the minimum intelligence to understand those concepts (say an IQ of 100) meaning our average MUST be above this baseline.

"Advanced Mathematicians" have to understand differential calculus, number theory, discrete maths and combinatorices. As I DON'T understand these concepts... and probably cannot in most cases..... their baseline intelligence must be above this level (say an IQ of 110 minimum) meaning their average must be above this baseline.

You can't seriously be suggesting that Singularitarians are more intelligent than all other arbitrary "groupings" of humans (as your "we are the MOST intelligent" claim states)........ what about the arbitrary grouping of "Nobel Prize Winners" or "Published Mathematicians" or of "Professors" or even "Post-Doctoral Researchers"?

#35 RighteousReason

  • Topic Starter
  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 01 September 2006 - 02:59 PM

greasypole, we've moved on from that.

thanks for your entirely redundant input.

#36 thegreasypole

  • Guest
  • 13 posts
  • 0

Posted 01 September 2006 - 03:22 PM

Fine,

Then what exactly ARE we talking about at present?

If you want I can pick at your posts as you pick at Richard's..... but that won't get us anywhere without a point to the conversation. What is it, now that you've retracted your original statement?

Or is this just a chance for you to ramble and strut your overly definitive views?

TGP

#37 RighteousReason

  • Topic Starter
  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 01 September 2006 - 03:27 PM

Just to clarify, I was never looking for a debate on whether or not the title of this thread was a true or false statement.

Right now the topic of debate in this thread is whether or not " the integration of the direct computational processes of consciousness with the acceleration of computer power [will] define the future of the Universe".

Or wherever things get sidetracked to [lol]

#38 thegreasypole

  • Guest
  • 13 posts
  • 0

Posted 01 September 2006 - 03:55 PM

OK. Then we have some nomenclature to get set up......

What do you mean "future of the universe" ? How do you mean intelligence will "define" it ? I can guess (as I do below) but if I do it' slikely I'll end up talking past your real point.... here's my first pass......... I don't think you can be certain that intelligence will define ANYTHING about the future of our universe.

Certainly there can be theoretical limits to what any level of intelligence, no matter how arbitrarily high, can do.

Let's say, for example, that the amount of matter in the universe is sufficient to provide the gravity to cause a Big Crunch in another 15bn years. What's more, let's say that there is no way of removing the gravitational effects of mass from the universe, only of moving them around inside this universe. Both limits are realistic.

In that scenario the ultimate destiny of the universe....... its "defined future"....... is to crunch up all the intelligence in the universe to levels compact enough to end it, and nothing the "superhuman" intelligence can do will change that.

Your superhuman intelligence will have defined nothing about the ultimate future of the universe.

Without a complete knowledge of the ultimate physical limits of the universe your argument is meaningless........ it could be that everything (to a sufficiently intelligent being) is a "variable" and superhuman intelligence WILL define the future of the universe........ However, it could be that much, including much that "defines" the "future" of the universe, is a "constant" that no arrangement of atoms in that universe (no matter how intelligent) can change. If that is the case the future of the universe is pre-ordained no matter how intelligent bits of that universe get.

Finally, to throw a final fillip into the conversation....... suppose that time itself is an illusion caused by an imperfect view of the universe, a view analogous to that of watching a film (even though it appears to evolve with time, it is in fact a set progression captured on a block of celluloid that only looks to be progressing because of being viewed in a special manner)...... that the universe itself is, in fact, a 4-dimensional "block of ice", with each slice of the block being the universe in 3D at each point in time, like a film captured, and we can only see the "projection", not the underlying recording it is projected from. If that is the case...... nothing can change from that pre-ordained path.

For sure, that "block of 4D ice" contains superhuman intelligence........ but that intelligence would no more define the future than an air bubble trapped in the left most part of a real block of ice defines the structure of the right most part.

TGP

#39 RighteousReason

  • Topic Starter
  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 01 September 2006 - 09:06 PM

What do you mean "future of the universe"?

I mean, say, think about what you imagine the future might be like, or what you want your future to be like, and then take into account a superintelligent 'really powerful optimization process'. Once you consider that notion, all your suppositions/volitions about the future are engulfed and completely dominated by the optimization power directed toward the supergoal of the superintelligence.

Like, us humans are trying to change the Universe around us in various ways, as it is modelled by our goal system, but regardless of what goal states we cause the Universe around us to tend towards, the goal states of a superintelligence have so much more optimization power behind them that any other goal states are basically moot.

What I am saying has nothing to do with some 'ultimate future' of the Universe, has nothing to do with 'breaking the limits' of the Universe, and has nothing to do with "blocks of 4D ice".

#40 Richard Leis

  • Guest
  • 866 posts
  • 0
  • Location:Tucson, Arizona

Posted 02 September 2006 - 03:07 AM

I see no reason to assume that posthumanity will want to change the Universe in any way. It is at least as likely that they will be content to let the Universe unfold around them as it will, with little or no intervention. That we attempt to manipulate our dust mote corner of the Universe suggests only that humanity is young.

A sufficiently advanced civilization may, through miniaturization and other improvements, eventually appear to vanish from the Universe, in terms of detectability by their resource usage and energy output. This apparent vanishing is a potential solution to Fermi's Paradox and may be a more compelling supergoal than one of superintelligence, or a complementary supergoal - Just how intelligent can I become using the least resources possible? Although humanity has certainly not obtained this goal, I believe we can already see this goal taking shape.

Stated another way by paraphrasing one of your statements: All my suppositions/volitions about the future are engulfed and completely dominated by the optimization power directed toward the supergoal of the superintelligence AND superefficiency.

My real goal is to become so enhanced that I can physically - most likely in microscopic or smaller form - traverse the universe as an explorer and observer. Superintelligence is one step toward this goal. My desire to manipulate the Universe is limited to obtaining this goal.

#41 RighteousReason

  • Topic Starter
  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 02 September 2006 - 03:29 AM

*excessively angry post removed*

conversation remains terminated.

Edited by hankconn, 03 September 2006 - 10:32 PM.


#42 Richard Leis

  • Guest
  • 866 posts
  • 0
  • Location:Tucson, Arizona

Posted 02 September 2006 - 03:38 AM

Really? You are saying that neither you nor I can determine any part of the future whatsoever? I disagree! Haha. Come on, that statement is ludicrous.


Yes, this is basically what I am saying. Sure, we may determine some of what is going on in our neck of the woods, but for how long? There are likely all sorts of civilizations out there doing the same in their own corners of the universe, yet none of them appear to really be affecting the universe in any measurable way.

Oh, so you have a theory about the desires of entities in the distant future that are drastically more intelligent than you are? Please share! No... don't, because again, this is absurd.


Yes, I do, as you do. You theorize about their supergoal, I counter that superintelligence will not be their supergoal or it will be deprecated by other supergoals.

If the power of moving galaxies means nothing to you, you lack a serious grasp of history and physics. I think such a power, in the hands of anyone, would be something of extreme significance and meaning, at least to me, and surely many more, if only out of the sheer sense of awe that genetically mutated apes could have made it so far without going extinct (and surely for more practical reasons, quite obviously).


The ability to move galaxies means nothing unless there is some compelling reason to do so. We are not solar-system-spanning entities, let alone universe-spanning ones. By the time we obtain such abilities, we will have no interest in moving galaxies, or in doing much manipulation at all.

Any meaning you find in such a feat is relative. Maybe to you, or maybe to all of humanity, such a feat might be meaningful. But there are millions more species yet on this planet, a future that will likely be full of new intelligences, and a likely multitude of intelligences elsewhere in the Universe. They might all find different meaning in their existence and their capabilities. I feel no reason to limit myself to human-centric ideas of meaning, except to occasionally stroke my ego.

I am also sorry that you feel this way. I derive a great amount of personal meaning and significance in my own, yours, and every other sentient beings' existence.


There is no reason to feel sorry that I feel this way. Rejecting religion and accepting my meaningless existence in this Universe was one of the most profound and freeing experiences of my life. It also provided the most terrifying fear of my life: that of falling up (symbolizing the eternity of existence in a vast Universe versus the trivial existence trapped by gravity on a planet).

Cause and effect. You are lost in your words... every event has a consequence.


Every event has a local consequence. I'm not sure how events on our scale compare with events at the universal scale.

A Universe where humans transcend to posthumanity is the pinnacle of all human intents and purposes. That Universe is not this Universe, here and now.


It is the pinnacle of all human intents and purposes, only. How this pinnacle might compare to other pinnacles is questionable.

The first AI is all that matters, be it our death or salvation.


It is all that matters to humans that feel this is their death or salvation. In any measure of the "scheme of things" I doubt the death or salvation of humanity ranks very high.

I have selfish reasons for wanting to live forever, for wanting to see the development of AGI and other technologies, for wanting to experience the Technological Singularity, for wanting to survive as posthuman. It suits my purposes to also pursue these goals for all of humanity and the other lifeforms on this planet, as it is no skin off my nose if every living entity and sentient being has a right to pursue physical immortality and enhancement.

These developments must be put in perspective, however, and we must eventually realize that the universe is not human-centric. It seems likely to me that lifeforms and sentient beings on other dust motes have experienced, are experiencing, and will experience a similar trajectory toward superintelligence, hyperefficiency, and physical immortality. I will thrill to our own progress as an active individual, but I will keep our progress in perspective.

#43 Richard Leis

  • Guest
  • 866 posts
  • 0
  • Location:Tucson, Arizona

Posted 02 September 2006 - 03:45 AM

This conversation is terminated due to an overwhelming response of complete fu.cking idiots. It would be an embarrassment for me to continue wasting my time talking to morons like you.


Since you were born 1 January 1907, I assumed you would have more patience with us. I apologize for wasting your time and hope you find more satisfactory responses elsewhere.

#44 RighteousReason

  • Topic Starter
  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 22 February 2009 - 11:53 PM

omfg. Haha... I just found this thread again. I think it's hilarious that I Google "ejaculation cryptonomicon" and it comes up with this thread ^ ^

Edited by advancdaltruist, 23 February 2009 - 12:55 AM.


#45 RighteousReason

  • Topic Starter
  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 23 February 2009 - 12:08 AM

Of course, Eliezer made the point I was trying to make in this thread a million times more eloquently way before I ever thought of it. This seems to happen to me a lot.

I still believe, however, that the Singularity represents an optimal fixed point of altruistic philosophy. That is, if you're a rational altruist of any kind, then as your worldview converges toward truth, your actions should converge toward working on the Singularity. It is simply the most effective way to accomplish good, for most known definitions of "good." I also think it's important to recognize that even if your quest is to find out what "good" really is, the Singularity is the most effective way to accomplish that as well. It's a fixed point of both philosophical questioning and philosophical altruism.


Edited by advancdaltruist, 23 February 2009 - 12:16 AM.


#46 RighteousReason

  • Topic Starter
  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 23 February 2009 - 12:36 AM

Rejecting religion and accepting my meaningless existence in this Universe was one of the most profound and freeing experiences of my life.

You are another one of these people who have no idea what "meaning" means.

there is no meaning in life

That sounds like a personal problem to me.

Well, "problem" is commonly defined as something having a negative effect on you. Going by that popular definition, no. I can live freely and happily knowing that there is most probably no meaning. The same way atheists can live happily knowing there is most probably no god.

I don't mind dying but prefer living, as it enables me to witness change and acquire knowledge - the only ideals resembling "meaning". :)

Just because there is no universal, ontological, God-given purpose in life, distinction between good and bad, designation of significance/importance, etc., doesn't mean these things don't exist. You would be hypocritical even to suggest it, because you surely define all of these and use your definitions on a regular basis, both in practical, day-to-day actions and intentions, and in higher-level long-term philosophies and goals.

What do you mean by "meaning" such that it doesn't exist?

You have sucked all the meaning out of the word "meaning". Why even use it?


When you talk about "meaning" it's like you redefine the word as having a personal definition in the negative, e.g. "that which is absent because I believe there is no god"

It's kind of sad, because that's not what "meaning" means to everyone else. It's like you've created post-religious blinders for yourself, so you can find meaning in life while telling yourself that isn't really what you are doing.


#47 RighteousReason

  • Topic Starter
  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 24 February 2009 - 03:17 AM

And for Christ's sake don't toss around statements about my supposed beliefs. My beliefs are exactly reality. Coincidentally, my state of knowledge/volition about my beliefs is never entirely complete or consistent at any given time.

One of my more awesome quotes. :)

To quote Anissimov's article originally referenced in this thread:

Rationality is the art of holding internal beliefs that correspond as closely as possible to the objects and patterns of external reality. Rationality is the idea that any goal we want to accomplish can be greatly furthered by improving the quality and accuracy of our thoughts, rather than engaging in wishful thinking, faith-based belief, foolish overestimation, blind confidence, and the like. Rationality is not necessarily at odds with love or emotion, but can work in parallel with them to achieve the maximum in personal fulfillment, effectiveness, and sanity. Rationality is also a force that helps us effectively arrange our life goals into a coherent framework, and create realistic plans for completing those goals. It's hard to say exactly how much rationality is to thank for how far the human race has come since our origin, but it's likely that rationality (or approximations to it) is responsible for the bulk of it.


http://www.accelerat...ritarianism.htm

This is a really awesome article, by the way. Everybody go read it, right now!

Edited by advancdaltruist, 24 February 2009 - 03:20 AM.




