LongeCity
Advocacy & Research for Unlimited Lifespans





If you could augment your intelligence, how far would you go?


112 replies to this topic

Poll: If you could augment your intelligence, how far would you go? (239 members have cast votes)

How much would you augment your intelligence?

  1. Actually, I want to reduce my intelligence. Knowledge is a burden (1 vote [0.42%])

    Percentage of vote: 0.42%

  2. No thanks, I'm smart enough as it is (5 votes [2.09%])

    Percentage of vote: 2.09%

  3. I'd make myself as smart as Einstein (9 votes [3.77%])

    Percentage of vote: 3.77%

  4. I'd give myself superhuman intelligence but I'd keep some limitations so I could still be challenged (29 votes [12.13%])

    Percentage of vote: 12.13%

  5. I'd become smart enough to understand every concept that can be understood (176 votes [73.64%])

    Percentage of vote: 73.64%

  6. Other (explain) (19 votes [7.95%])

    Percentage of vote: 7.95%


#31 bacopa

  • Validating/Suspended
  • 2,223 posts
  • 159
  • Location:Boston

Posted 20 March 2008 - 06:35 AM

I chose smart enough to understand every known and unknown concept. And I agree with the person who said there will always be knowledge out there, so life should never get boring... if it does, then I would take away some of my genius abilities, I think...

#32 dr_chaos

  • Guest
  • 143 posts
  • 0
  • Location:Vienna

Posted 20 March 2008 - 11:42 AM

I chose smart enough to understand every known and unknown concept. And I agree with the person who said there will always be knowledge out there, so life should never get boring... if it does, then I would take away some of my genius abilities, I think...

I think even if you know everything about a thing, you can still enjoy it. Maybe you enjoy it even more then. Take riding a bike as an example: if you are good at riding, you know pretty much exactly how the bike will behave and what happens if you do something. Still, it's much more fun riding, doing tricks and playing with it once you have mastered it than while you are learning to ride (at least for me). It's not all about acquiring knowledge; applying it is often much more fun.


#33 Shannon Vyff

  • Life Member, Director Lead Moderator
  • 3,897 posts
  • 702
  • Location:Boston, MA

Posted 21 March 2008 - 03:22 AM

Well, I'm also sure that while there will always be something to learn, the concepts of knowing and understanding will always be different. There are two sides, and a myriad of ranges in between, for just about any topic. How could an all-knowing being ever 'pick a side'? ;) Ah, who is to choose the greater good, or the laws of the Universe? But yes, we have a ways to go before we can even address such questions to ourselves.

#34 gashinshotan

  • Guest
  • 443 posts
  • -2

Posted 21 March 2008 - 03:55 AM

After getting the s**t kicked out of me by a hard final exam... yeah, I want to become as smart as possible without becoming unstable.


No, you don't want to do that. A lot of intelligent people are psychopaths or worse.

#35 forever freedom

  • Guest
  • 2,362 posts
  • 67

Posted 21 March 2008 - 04:57 AM

After getting the s**t kicked out of me by a hard final exam... yeah, I want to become as smart as possible without becoming unstable.


No, you don't want to do that. A lot of intelligent people are psychopaths or worse.


But being intelligent doesn't necessarily mean the person will become "evil".

#36 gashinshotan

  • Guest
  • 443 posts
  • -2

Posted 21 March 2008 - 05:09 AM

But being intelligent doesn't necessarily mean the person will become "evil".


But it does mean an evil person can think of evil, viable schemes to fulfill their narcissistic needs and inflict far more damage than mere criminals.

#37 forever freedom

  • Guest
  • 2,362 posts
  • 67

Posted 21 March 2008 - 05:44 AM

But it does mean an evil person can think of evil, viable schemes to fulfill their narcissistic needs and inflict far more damage than mere criminals.


That's for sure. Intelligence is an expander of personal power and abilities. The more intelligent a person is, the more power this person has to influence reality and the environment. So of course, more intelligence for an evil person would mean more power to do evil.

#38 phylodome

  • Guest
  • 19 posts
  • 0
  • Location:New Haven, CT

Posted 21 March 2008 - 07:45 AM

I think I will upgrade my intellect as soon as intelligence-modifying components are agreed to be safe to use. I will upgrade to become as intelligent as possible. I will not, however, tamper with my emotions using super-drugs, or anything of the sort.
- Adam


It would seem that the drastic alteration of one's mental faculties to any degree would fundamentally alter the relationship between the complex, emergent phenomenon that is consciousness and your more primitive drives and desires. Your emotions, or affective states, are the perceptual artifacts of your limbic system. Your subjective understanding of those evolved substrates hinges greatly on the way that you alter, by way of learning and experience, the architecture of your higher cognitive capacities. If you integrate vastly superior wetware, it's naive to think that you will retain the ability to maintain a relationship with your emotional "self" similar to the one you have now.

Personally, I look forward to expanding my notion of consciousness to incorporate a networked sense of existence wherein I can still maintain an identity at will, but might also possess the capacity to distribute my mental capacity through decentralized systems, in essence becoming multiple links in other individuals' thought processes, and furthermore consciously evolving to incorporate reflections of these other intelligences into my own self-identity. To an extent this is what the net represents in a very impersonal instantiation, but I envision this being an experience that resembles gazing into every conscious being's eyes simultaneously. This would not be for the purposes of seeing their thoughts, but merely sensing the multitude of existing subjective perspectives.

Anyhow, I checked "other".

#39 gashinshotan

  • Guest
  • 443 posts
  • -2

Posted 21 March 2008 - 10:30 AM

That's for sure. Intelligence is an expander of personal power and abilities. The more intelligent a person is, the more power this person has to influence reality and the environment. So of course, more intelligence for an evil person would mean more power to do evil.


i.e. Hitler, Stalin, Mao, Fidel, Saddam, etc.

#40 dr_chaos

  • Guest
  • 143 posts
  • 0
  • Location:Vienna

Posted 21 March 2008 - 03:14 PM

It would seem that the drastic alteration of one's mental faculties to any degree would fundamentally alter the relationship between the complex, emergent phenomenon that is consciousness and your more primitive drives and desires. Your emotions, or affective states, are the perceptual artifacts of your limbic system. Your subjective understanding of those evolved substrates hinges greatly on the way that you alter, by way of learning and experience, the architecture of your higher cognitive capacities. If you integrate vastly superior wetware, it's naive to think that you will retain the ability to maintain a relationship with your emotional "self" similar to the one you have now.

In other words: You change and get a new view on reality. But that's what augmentation is about in the first place, right?

#41 forever freedom

  • Guest
  • 2,362 posts
  • 67

Posted 21 March 2008 - 05:24 PM

i.e. Hitler, Stalin, Mao, Fidel, Saddam, etc.


Yes, yes. I just hope you got my point: more intelligence for people in general is good, because there are a lot of good people out there, hopefully more than evil people. So in general we would advance faster as a civilization.

#42 RighteousReason

  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 31 October 2008 - 12:19 AM

Here is an interesting SL4 post. I don't know what to make of it. I do know that Michael Wilson is one of the most brilliant people on the planet. According to him, he still pretty much agrees with everything here, although there should be workarounds, and besides, hopefully an FAI could figure this all out for us.

Normative Reasoning: A Siren Song?
From: Michael Wilson (mwdestinystar@yahoo.co.uk)
Date: Sun Sep 19 2004 - 08:02:45 MDT

Though speculation about post-Singularity development trajectories is
usually futile, my recent research has thrown up a serious moral issue
which I believe has important implications for CV. The basic points are
that a normative method of reasoning exists, it and close approximations
thereof are tremendously powerful and any self-improving rational
intelligence (artificial or upload) will eventually converge to this
architecture unless their utility function explicitly prevents this
action.


The problem here is that just about all the human qualities we care about
are actually the result of serious flaws in our cognitive architecture,
and many of these seem to have lossless translations into goal specifications
for a perfectly rational substrate (the basis for Yudkowsky's really
powerful optimisation processes). As humanity self-improves, normative
reasoning (of which appropriately implemented Solomonoff induction is at
the very least a good approximation) is a major attractor; adopting it makes
you as effective as possible at utilising any finite amount of information
and computing power. If there's any sort of competition going on, turning
yourself into a RPOP is the way to win. Unfortunately it also appears to be
the end of most of the stuff we place moral value on. A universe full of
perfect rationalists is a universe where all diversity resides solely in
people's goal systems (which may or may not converge); the qualities of
'insight', 'creativeness', 'willpower' etc. will all disappear as they are
defined against flaws, and goal-system properties such as 'compassion' will
revert to 'did this person have an initial utility function that was
compassionate under renormalisation'? This is on top of the already known
issues with qualia and the illusion of free will; both are results of
specific (adaptive) flaws in human introspective capability which would be
relatively trivial for transhumans to engineer out, but at the cost of
breaking the grounding for the actual (rather than theoretical)
implementation of our moral and legal systems and creating something we can
no longer empathise with.


The basic question here is 'can we create a Power we can care about?'. A
Yudkowsky RPOP is at least potentially a Power, but it is explicitly
designed to be one we don't care about, as it isn't sentient in a way we'd
assign moral worth to (a decision currently made using our ad-hoc evolved
neural lash-together). What do we need to add to make it volitional, and
what further characteristics would we want to be present in the beings
humanity will become? Are some inherent limitations and flaws actually
necessary in an intelligence in order for it to qualify as something worthwhile?
Less relevantly, is a Nice Place To Live likely to insist that its volitional
sentients have some selection of reasoning flaws in order to create a
diverse and interesting society? This is something of a blow for
rationalists, in that perfect rationality may indeed be hopelessly inhuman,
but isn't there a way to hybridise normative and non-normative reasoning
into a cognitive architecture that is both powerful and morally relevant
(ok, perhaps this is my desire for Cosmic Power coming through :)?


The CV question could be glibly summarised as 'is there a likely incremental
self-improvement path from me to a paperclip optimiser?'. While few people
like paperclips /that/ much, it seems likely that many people would choose
to become perfect rationalists without appreciating what they're losing. If
there is a path to normative reasoning that looks locally good all the way
and reports back that everything is fine when extrapolating, an
implementation of CV that doesn't allow for this may lead us into something
we should consider a disaster.


This issue is ultimately a comprehension gap; a universe of perfect
rationalists might well be rated as valuable inhabitants, but we have no
way of mapping our conception of worthwhile and desirable onto this
basically alien assessment. Along the wild ride that has constituted my
seed AI research to date, my original engineering attitude (focus on
practical stuff that works, everything can be fixed with enough technology)
has had to expand to acknowledge the value of both the abstract (normative
reasoning theory and relevant cosmology) and the humanist (despite all the
hard maths and stuff you have to cover just to avoid disaster, Friendliness
ultimately comes down to a question of what sort of universe we want to
live in).


* Michael Wilson


http://www.sl4.org/b...i.pl?Starglider


http://www.sl4.org/a.../0409/9841.html
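
(A note appended for context, not part of Wilson's post: the "appropriately implemented Solomonoff induction" he invokes is, in one standard formulation, prediction under the universal prior, which weights every program p that makes a universal machine U output a continuation of the observed data x by its length:

    M(x) = Σ_{p : U(p) = x*} 2^(−|p|)

Shorter programs carry exponentially more weight, so a reasoner using M allocates belief purely by simplicity and fit to the data; this is the sense in which Wilson calls it maximally effective at "utilising any finite amount of information". M itself is incomputable, which is why the post speaks of implementations and "close approximations thereof".)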

Edited by Savage, 31 October 2008 - 02:55 AM.


#43 cyborgdreamer

  • Topic Starter
  • Guest
  • 735 posts
  • 204
  • Location:In the wrong universe

Posted 01 November 2008 - 07:51 PM

It would seem that the drastic alteration of one's mental faculties to any degree would fundamentally alter the relationship between the complex, emergent phenomenon that is consciousness and your more primitive drives and desires. Your emotions, or affective states, are the perceptual artifacts of your limbic system. Your subjective understanding of those evolved substrates hinges greatly on the way that you alter, by way of learning and experience, the architecture of your higher cognitive capacities. If you integrate vastly superior wetware, it's naive to think that you will retain the ability to maintain a relationship with your emotional "self" similar to the one you have now.


I think it would be possible to design intelligence upgrades that would preserve our capacity for emotion. After all, we do seem to share emotions with small-brained animals. If we assume mice are conscious, a mouse's fear probably feels similar to human fear. The only difference is that we can better understand the situations we are afraid of and our minds can (meaningfully) generate the words "I'm scared".

#44 automita

  • Guest
  • 25 posts
  • 0
  • Location:san diego, ca. usa

Posted 03 November 2008 - 10:44 PM

As far as I could go, to solve as many problems for humanity as possible. The smarter you are, the less you need.

#45 brokenportal

  • Life Member, Moderator
  • 7,046 posts
  • 589
  • Location:Stevens Point, WI

Posted 04 November 2008 - 08:24 PM

I'd want to continue on toward understanding every concept; if along the way I found that limits in certain areas were more useful than not, then I'd want to implement them.

Why would anybody not want to be able to move toward the understanding of every concept? Not understanding stuff leaves room for mistakes, and mistakes leave room for destruction on unforeseeable levels, ranging from trivial to the end of the universe.

#46 REGIMEN

  • Guest
  • 570 posts
  • -1

Posted 05 November 2008 - 04:55 AM

I'd become as smart as possible to create my own language and culture that logically did not contain any of the malfactors of the currently available systems. Then I would breed two generations deep into the future and make sure these offspring had no means of being tainted by the olde garde.

It would look a little like a bunch of high functioning idiots doing hippy commune stuffs but without the patchouli and dirty feet. All dirty feet will be caned. All patchouli will be waved out of face.... because that's all we'll be willing to do about it...get on with things, you know?

We will turn the upper stratosphere into a wind-turbine meat grinder bringing us the freshest seasonal birds when they so choose to feed themselves to us.

We will blind all bees and our shorter brethren with our rampant application of solar panels (panel tilt angle).

We might as well ranch dolphins since they have such large brains, livers, and kidneys, which are far more efficiently nutritious than any other food on earth.

Cars. We got 'em. But they run on seawater. The fumes are captured as potable drinking water that we barter to inland fruitstand owners for some of those nearly extinct Polaroid photos of our trip destination: Inland Fruitstand. Wait, freshwater...? Who cares, it's a Polaroid! A pity all the salt we drop alongside the fruit-tree groves to get there...

We'll laugh and galumpfph and trip because we don't know what dancing is, and who really cares when the lights are all on? Everyone looks like an ass when they're dancing, so we've won: loss of self-consciousness.

Corduroy is the national dress code for friction-harvesting technologies. Dance party shinkansen...all the way there. We got dolphin in our bellies, we can handle.

Anything I've forgotten? Oh that's right...all of it.

Edited by REGIMEN, 05 November 2008 - 05:05 AM.


#47 mpe

  • Guest, F@H
  • 275 posts
  • 182
  • Location:Australia

Posted 06 November 2008 - 06:43 AM

I'd just like to be smart enough to win the occasional argument with my wife and/or daughter.

#48 vyntager

  • Guest
  • 120 posts
  • 2

Posted 06 November 2008 - 11:03 PM

I'd want to continue on toward understanding every concept; if along the way I found that limits in certain areas were more useful than not, then I'd want to implement them.

Why would anybody not want to be able to move toward the understanding of every concept? Not understanding stuff leaves room for mistakes, and mistakes leave room for destruction on unforeseeable levels, ranging from trivial to the end of the universe.



That's a certainty, and on a practical level, we will need to have as many powerful means as possible at our disposal, if we are to survive our future.

But since I disagree with the idea that unweaving the rainbow doesn't make it any less beautiful (rather, it unveils a different kind of beauty, which may well be incompatible with the previous beauty we perceived in that rainbow), I think there's a pretty evident reason why one wouldn't want to grow intelligent beyond a certain point: the being resulting from that vastening wouldn't be you anymore. It's a kind of death.

Now of course, if nothing prevents you from duplicating yourself, having a part of you grow to post-singularity intelligence while the other(s) remain as you are now (or as you will be along the path, as many of them as you need to snapshot all the significantly different versions of yourself), then this fear is as good as dispelled.

People don't just exist as problem solvers, though they have to be problem solvers before anything else, by necessity. They could certainly also enjoy idle time as mere human beings. It's a valuable state of being in itself.

#49 Korimyr the Rat

  • Guest
  • 48 posts
  • -1

Posted 03 December 2008 - 03:54 AM

No limits. If I start getting bored, I'll find bigger challenges.

#50 yipe

  • Guest, F@H
  • 32 posts
  • 0

Posted 31 December 2008 - 06:52 PM

I would DEFINITELY become as smart as the new praxis would allow me. Ramp that thing all the way up to 100% or whatever it is. Someone mentioned the possibility of the super-super-intelligent becoming unstable. That certainly seems to be something that occasionally happens to our own super-intelligent people. However, I wouldn't mind this risk; for all we know, these mental giants might actually be acting perfectly rationally, but according to some set of circumstances that our puny minds are completely unaware of! Besides, I can only assume that at least some of these titans of thought will be doing SOMETHING that makes life better, or at least more interesting and entertaining, for the unwashed masses below them, so the rabble can put up with some late-night insane cackling or random "test" explosions if it means they get a better jeejah or whatsit. Most would put up with it just to get a little better quality of TV show.

#51 valkyrie_ice

  • Guest
  • 837 posts
  • 142
  • Location:Monteagle, TN

Posted 22 February 2009 - 08:15 AM

The problem of infinite intelligence is that if everyone knows everything and understands everything and has the exact same level of knowledge in every aspect of every field, then you no longer have individuality.

Let's say you do have a brain the size of a star. You know everything. You understand everything. You are in contact with every other mind in existence and share data simultaneously with all of them.

Why?

That person you share data with has the exact same data as you. Since you share all data, even differences between your locales in the universe are meaningless, because you will have already shared such differences and analysed them completely. Logically they are your equal, and so any conclusions they draw will be identical to yours; there will be no areas in which there is any difference between "you" and "them". In essence, talking to "them" is identical to sitting and thinking, but what is there really to think about? You know everything!

This is the end result of the rationalistic ideal. Personally, I view it as non-existence.

Do I want higher intelligence? Yes. Do I want to maintain my humanity while having it? You betcha. I don't want to control others, because if I did, where is the potential for interaction? How can I talk to them, learn new stuff from them, be forced to THINK by them, when *I* AM THEM???

If I were all alone, I'd want to clone myself a few dozen times, with gender, color, and size variations, probably even differences in whether all of us are even fully human or mixed anthropomorphs, simply to give myself someone to talk to who DID NOT HAVE THE EXACT SAME VIEWPOINT as myself. Even if we started out similar, our differences would grow over time.

So I suppose you could say higher, with limitations, because limitations leave potential for growth. Is it a risk? You betcha, but when the alternative is eternal stasis, I'll take my chances, thank you very much.

#52 Infernity

  • Guest
  • 3,322 posts
  • 11
  • Location:Israel (originally from Amsterdam, Holland)

Posted 02 March 2009 - 02:19 PM

Understand every concept...


I want to do that. I live to learn and learn to live. Once you know everything, you know how to live forever; and once you live forever, you have enough time to learn everything.


I do want to be able to understand it all; it has to be a challenge.

#53 Ben Simon

  • Guest
  • 352 posts
  • 3
  • Location:London

Posted 02 March 2009 - 04:09 PM

The thing we must remember is that our values (and in fact our very reasons for living) are formed in an evolutionary context. Much of morality for example is intimately connected to our aesthetic sensibilities, which are connected to our fear of things which are 'impure', which confers a survival advantage by directing us to avoid certain dangers. At some point tampering with things like intelligence becomes tampering with our nature. 'We' cease to exist, and the thing that has taken our place could very easily (I believe) find its intellect unhealthily out of proportion with its other sensibilities. I'm all for augmentation in principle, but we could be on a path toward existential meaninglessness. Food for thought.

Edited by ben, 02 March 2009 - 04:10 PM.


#54 Anonymous

  • Guest
  • 8 posts
  • 0

Posted 24 June 2009 - 02:48 AM

I'd become smart enough to understand every concept that can be understood.

Think about that: if we lived for 10,000+ years, we could learn so many things!! It could get boring if everyone augmented their intelligence. :~

Edited by Anonymous, 24 June 2009 - 03:23 AM.


#55 Athanasios

  • Guest
  • 2,616 posts
  • 163
  • Location:Texas

Posted 24 June 2009 - 04:41 AM

I think the more important question is how fast you would increase your intelligence. This would plug into Fun Theory.

#56 Reno

  • Guest
  • 584 posts
  • 37
  • Location:Somewhere

Posted 24 June 2009 - 07:13 AM

I chose other. Knowing the best course of action for every situation would make life extremely predictable and boring. If every person around you has the same problem, then there is no free will. Everyone knows what everyone else will do and lives life accordingly.

I would give myself a photographic memory and maybe increase my ability to reason by 20-30 IQ points. That would be more than enough to satisfy my needs.

There have been a lot of books about this subject. Everyone here should read some Larry Niven. He wrote a book called Protector, which is about what happens to humans when they enter the fourth developmental stage of human life. I think he won the Nebula Award once or twice for his Ringworld series.

Edited by bobscrachy, 24 June 2009 - 07:14 AM.


#57 Mr. Jingles

  • Guest
  • 30 posts
  • 2

Posted 24 June 2009 - 04:49 PM

I chose other, so I will explain.

First, it depends on the track record of the procedure. I certainly will not volunteer to be first. The risks of having everything you were be overwritten, in essence dying, are too great. I want to maximize what I have been given so far before embarking on something more, anyway. I've still got such a long way to go. I feel inferior in many ways. (It's not just a feeling. I know conclusively that I am deeply inferior, not merely to others, but to what I could be naturally.)

Mental superaugmentation may become a necessity if the augmented are so vastly superior as to enable them to collectively dominate all others on earth.

But how would that be different from a runaway super-AI? Maybe we would need some heroes to step forward to keep the AIs in check, while ensuring that the ultraintelligent entities still have humanity. I get so jaded about the weaknesses of humans, but my innate love of people is so strong...

If we live for thousands of years, mental augmentation may be a necessity. Imagine being in the stone age among modern people.

#58 waldemar

  • Guest
  • 206 posts
  • 0

Posted 24 November 2009 - 05:27 PM

As smart as possible of course.

http://en.wikipedia....wisatz_Haderach
http://en.wikipedia....i/Pak_Protector

#59 David Styles

  • Life Member
  • 512 posts
  • 295
  • Location:UK

Posted 06 December 2009 - 02:03 PM

No, you don't want to do that. A lot of intelligent people are psychopaths or worse.


A lot of unintelligent people are also psychopaths or worse.

In my professional life, I have seen sociopathy (which is, I presume, what you mean) in adults with learning disabilities. We're talking really, really low intelligence here, yet still all the same behavioural markers. The only difference is that a certain level of intelligence is generally required to be dangerous to society.

That said, I'd wager that a sociopath with intelligence above a certain level would be intelligent enough to not go on mass-murdering sprees, etc. The kind of person colloquially known as the "white collar psychopath".


#60 bacopa

  • Validating/Suspended
  • 2,223 posts
  • 159
  • Location:Boston

Posted 06 December 2009 - 02:55 PM

I agree with the general consensus of the SL4 argument that normative rational thinking, although possibly reaching levels of perfection, might be terribly boring; and I also agree with Valkyrie that if everyone had the same level of intellect, individuality would cease to exist, which overlaps with the SL4 argument, of course.

I would vote to know every known concept, and somehow incorporate fun theory in there, by either slowing down my knowledge-accumulation process or just using set knowledge and my intellect to do cool and fun things that I couldn't really speculate on, being at this primitive stage in the game.

I would think throwing in ideas like willpower, competition, even deleterious and maybe slightly irrational mood states might benefit this intelligence, just as long as it doesn't become dangerous or sociopathic. I guess I also agree with David Styles in this sense, and his argument that low intellect can yield psychopathic tendencies as well. Mental illness is a contributor too, and I'm pretty sure Hitler was mentally ill on some or many levels. However, I do think the American Psycho stereotypical psychopath would be far more dangerous than a more impaired mind with the same tendencies.

Finally, this overlaps with Portal's argument that imperfection leads to terrible, often catastrophic mistakes, and you can well imagine some of the scenarios. I would like to build my intelligence (assuming I would augment it with nano-, psycho-, or whatever other technology the future would yield) to have iterative gradients of well-being (as David Pearce comments), whilst also knowing every known concept... somehow the combination could yield Yudkowsky's vision of true fun theory.

I also agree with Vyntager's argument that even though something will be lost as we morph from human to posthuman intellect, the new intellect will be just as exciting, probably exponentially more so, and we probably won't miss our old brains, just as long as we can incorporate some kind of emotional reasoning... or, simply put, feelings.

Edited by dfowler, 06 December 2009 - 03:06 PM.




