  LongeCity
              Advocacy & Research for Unlimited Lifespans





If you could augment your intelligence, how far would you go?


112 replies to this topic

Poll: If you could augment your intelligence, how far would you go? (239 member(s) have cast votes)

How much would you augment your intelligence?

  1. Actually, I want to reduce my intelligence. Knowledge is a burden (1 vote [0.42%])

    Percentage of vote: 0.42%

  2. No thanks, I'm smart enough as it is (5 votes [2.09%])

    Percentage of vote: 2.09%

  3. I'd make myself as smart as Einstein (9 votes [3.77%])

    Percentage of vote: 3.77%

  4. I'd give myself super human intelligence but I'd keep some limitations so I could still be challenged (29 votes [12.13%])

    Percentage of vote: 12.13%

  5. I'd become smart enough to understand every concept that can be understood (176 votes [73.64%])

    Percentage of vote: 73.64%

  6. other (explain) (19 votes [7.95%])

    Percentage of vote: 7.95%


#1 cyborgdreamer

  • Guest
  • 735 posts
  • 204
  • Location:In the wrong universe

Posted 28 February 2008 - 11:32 PM


Suppose artificial neurons were invented that could connect with your brain to make you as intelligent as you wanted. Assuming it was 100% safe and guaranteed to preserve your consciousness and personal identity, would you do it? And, if so, how much?

#2 forever freedom

  • Guest
  • 2,362 posts
  • 67

Posted 29 February 2008 - 12:14 AM

I would definitely become as intelligent as possible. Even setting the personal reasons aside, if I decided not to be as smart as possible, others who chose to become as intelligent as they could would dominate the "lesser" beings. I wouldn't want to be dominated.


And if the question didn't concern other people around me, that is, if only I could choose to increase my intelligence and others couldn't, I would still choose to become as smart as possible, for the simple reason that it would give me a better chance of surviving all the dangers that surround us, not only on Earth but in the universe at large. Once I reached the point where nothing in the universe could harm me, I would, just for fun, temporarily "disable" a percentage of my intelligence, probably in a virtual reality, and enjoy myself.
  • like x 3


#3 Cyberbrain

  • Guest, F@H
  • 1,755 posts
  • 2
  • Location:Thessaloniki, Greece

Posted 29 February 2008 - 12:41 AM

I'd become smart enough to understand every concept that can be understood and beyond.

#4 Liquidus

  • Guest
  • 446 posts
  • 2
  • Location:Earth

Posted 29 February 2008 - 08:04 PM

I think knowledge is empowering; being able to grasp a difficult or abstract concept is empowering. Because of that, I would want to be able to understand every concept and beyond. There are undoubtedly concepts too advanced for our normal human brains to understand, but augmentation of the brain could help.

I think contemplation of the universe alone shows that even at the most extreme levels of intelligence, there are still challenges to be overcome, and there will always be challenges.

#5 JonesGuy

  • Guest
  • 1,183 posts
  • 8

Posted 29 February 2008 - 09:00 PM

I figure that I'd be able to afford at least one star system's worth of substrate. By the time we're mobile enough to get our own starships, there will likely be a lot larger population.

Though if you're quick and aggressive, you could probably incorporate an entire galaxy before you run into competition.

#6 cyborgdreamer

  • Topic Starter
  • Guest
  • 735 posts
  • 204
  • Location:In the wrong universe

Posted 29 February 2008 - 09:00 PM

I think contemplation of the universe alone shows that even at the most extreme levels of intelligence, there are still challenges to be overcome, and there will always be challenges.


I think, if you could really understand all of the concepts in existence, you'd be essentially omniscient. You'd instantly foresee any future problems and their solutions (assuming they were solvable at all).

#7 cyborgdreamer

  • Topic Starter
  • Guest
  • 735 posts
  • 204
  • Location:In the wrong universe

Posted 29 February 2008 - 09:12 PM

I figure that I'd be able to afford at least one star system's worth of substrate. By the time we're mobile enough to get our own starships, there will likely be a lot larger population.

Though if you're quick and aggressive, you could probably incorporate an entire galaxy before you run into competition.


Here's a thought. If limited amounts of matter became a problem, we could pool all of it into a universally accessible mega-knowledge base that people could access remotely through their own brains (with some sort of chip, maybe). You'd keep your memories and personality in your own brain but everyone would be able to access all of the universe's conceptual knowledge on demand without competition for substrate. Of course, the light speed barrier might become a problem if you get really far away from the thing.

#8 Grimm

  • Guest
  • 92 posts
  • 4
  • Location:America

Posted 01 March 2008 - 01:20 AM

I'm fine, thanks.

#9 jackinbox

  • Guest
  • 452 posts
  • 4

Posted 01 March 2008 - 02:12 AM

I think it's somewhat silly to say that we want unlimited intelligence. We cannot, by definition, understand how being that intelligent would feel. Without challenges, life would be terribly boring. It would be impossible to keep our current personality, because our personality is in part defined or influenced by our knowledge and our intelligence.

Scott Adams develops an interesting theory in his book "God's Debris". Since God was able to predict the future, he was very bored, so he decided to blow himself up to see what would happen next. That was the Big Bang. It's not a concept I believe in, but it's quite interesting.

I will let Marvin have the final word:
"Here I am, brain the size of a planet, and they get me to take you down to the bridge. Call that job satisfaction? 'Cos I don't."

#10 modelcadet

  • Guest
  • 443 posts
  • 7

Posted 01 March 2008 - 09:53 AM

In the poll, answers no. 1 and no. 3 are the same. You might wanna change that. :p

#11

  • Lurker
  • 0

Posted 01 March 2008 - 10:17 AM

It is difficult to say.

I want to become much, much, much smarter than I am now, that's for sure, but I can't say how far I would go, since none of us has any idea what the ramifications of having such intelligence even are.

#12 cyborgdreamer

  • Topic Starter
  • Guest
  • 735 posts
  • 204
  • Location:In the wrong universe

Posted 01 March 2008 - 04:11 PM

I think it's somewhat silly to say that we want unlimited intelligence. We cannot, by definition, understand how being that intelligent would feel. Without challenges, life would be terribly boring. It would be impossible to keep our current personality, because our personality is in part defined or influenced by our knowledge and our intelligence.


How you feel depends more on your emotional systems than on your intelligence. We need challenges now because we are only capable of being interested in things that challenge us. However, this doesn't necessarily have to be so. We could, theoretically, augment our intelligence while maintaining the feeling of engagement with whatever interests us now.

As for whether our personality would be the same: that depends on how you choose to define your identity. Personally, I don't want to define myself by my limitations. As long as I had the same interests and values, and I wasn't less intelligent than I am now, I'd consider myself the same person.

#13 AdamSummerfield

  • Guest
  • 351 posts
  • 4
  • Location:Derbyshire, England

Posted 01 March 2008 - 04:30 PM

I voted "other (explain)":

I think I will upgrade my intellect as soon as intelligence-modifying components are agreed to be safe to use. I will upgrade to become as intelligent as possible. I will not, however, tamper with my emotions using super-drugs or anything of the sort, since I believe meditation is the best way to do that. That way I am not reliant on a drug, and my feelings of compassion are entirely self-produced. Also, I think I would prefer to continue reading rather than have information downloaded directly into my brain, though this may change. I wish to develop wisdom and compassion on my own, and leave boosting my information capacity and learning capabilities to new technologies.

- Adam

#14 Brainbox

  • Member
  • 2,860 posts
  • 743
  • Location:Netherlands
  • NO

Posted 01 March 2008 - 07:47 PM

I'm fine as it is.

#15 maestro949

  • Guest
  • 2,350 posts
  • 4
  • Location:Rhode Island, USA

Posted 01 March 2008 - 08:33 PM

Rather than just more intelligence, I'd like to be able to process multiple streams of consciousness simultaneously. Hundreds, if not thousands, of processing threads would be nice. Engineering just about anything, regardless of complexity, size, or speed, would become a breeze.

jackinbox raises a good point, though. Once you obtain more intelligence, your outlook might change.


Edit: Typo

Edited by maestro949, 01 March 2008 - 08:34 PM.


#16 gashinshotan

  • Guest
  • 443 posts
  • -2

Posted 01 March 2008 - 11:02 PM

I'd connect myself to a network of other minds and control them.

#17 cyborgdreamer

  • Topic Starter
  • Guest
  • 735 posts
  • 204
  • Location:In the wrong universe

Posted 02 March 2008 - 12:19 AM

I'd connect myself to a network of other minds and control them.


Don't you think there would be safeguards against that? I definitely wouldn't connect if it would make me vulnerable to mind control.

#18 jackinbox

  • Guest
  • 452 posts
  • 4

Posted 02 March 2008 - 02:13 AM

I would prefer a better memory to an improved intelligence. Knowledge is wisdom!

#19 cyborgdreamer

  • Topic Starter
  • Guest
  • 735 posts
  • 204
  • Location:In the wrong universe

Posted 02 March 2008 - 03:09 AM

I would prefer a better memory to an improved intelligence. Knowledge is wisdom!


Why not both?

#20 gashinshotan

  • Guest
  • 443 posts
  • -2

Posted 02 March 2008 - 04:10 AM

I'd connect myself to a network of other minds and control them.


Don't you think there would be safeguards against that? I definitely wouldn't connect if it would make me vulnerable to mind control.


Of course I would do it without being detected, and it could be subtle control, through images and movies during sleep, and eventually total psychological and mechanical control of the link.

#21 cyborgdreamer

  • Topic Starter
  • Guest
  • 735 posts
  • 204
  • Location:In the wrong universe

Posted 02 March 2008 - 03:49 PM

I'd connect myself to a network of other minds and control them.


Don't you think there would be safeguards against that? I definitely wouldn't connect if it would make me vulnerable to mind control.


Of course I would do it without being detected, and it could be subtle control, through images and movies during sleep, and eventually total psychological and mechanical control of the link.


What you're imagining sounds more like a hive mind than a knowledge base. I was thinking that it would be something you could read but not modify and that you could access on demand but wouldn't affect your brain without your knowledge and consent.

#22 maestro949

  • Guest
  • 2,350 posts
  • 4
  • Location:Rhode Island, USA

Posted 02 March 2008 - 04:10 PM

I'd connect myself to a network of other minds and control them.


Actually this doesn't sound that far from what the powers that be try to do with all of the existing media outlets. Whether it be "Buy our Products", "Hate Liberals", or "Believe in our Moral Codes and Follow Our Rituals and You'll be Saved", the name of the game is dictating behavior through viral memes.

Don't you think there would be safeguards against that? I definitely wouldn't connect if it would make me vulnerable to mind control.


But you are connected... :p

Edited by maestro949, 02 March 2008 - 04:11 PM.


#23 mentatpsi

  • Guest
  • 904 posts
  • 36
  • Location:Philadelphia, USA

Posted 07 March 2008 - 01:39 AM

The idea of understanding all concepts seems impossible; there's always something else that will result from knowledge... Personally, I worry about the idea of becoming too smart and then getting assassinated or abducted for examination. I don't really trust people in power enough to rise that far above the herd openly, but if I were able to keep it quiet and occasionally (making it look difficult) release award-winning research in all areas of science, I'd go to the top in a heartbeat... My god, this sounds like Dune, lol. The spice must flow... :)

#24 Utnapishtim

  • Guest
  • 219 posts
  • 1

Posted 15 March 2008 - 09:43 PM

I have no interest in radical intelligence enhancement. A better memory would be nice; that's about it. Were I to increase my intelligence even to the current outer edge (IQ 200 or so, I believe), my inner life would be so dramatically different from the current one that there would be little to connect the two people. My interests and hobbies, relationships and social network, experiences and ambitions, my entire psychological landscape is shaped by my current intelligence level. Were I to increase my intelligence to the point where my entire previous life experience was rendered irrelevant, then I, the person I am today, would be essentially dead.
  • like x 1

#25 dr_chaos

  • Guest
  • 143 posts
  • 0
  • Location:Vienna

Posted 15 March 2008 - 09:55 PM

Since I can't remember a day when I did not hit my mental limits, I chose "I'd become smart enough to understand every concept that can be understood."

I'd connect myself to a network of other minds and control them.

And of course I'd strive to dominate the universe after that :)

#26 JonesGuy

  • Guest
  • 1,183 posts
  • 8

Posted 15 March 2008 - 10:09 PM

Here's a thought. If limited amounts of matter became a problem, we could pool all of it into a universally accessible mega-knowledge base that people could access remotely through their own brains (with some sort of chip, maybe). You'd keep your memories and personality in your own brain but everyone would be able to access all of the universe's conceptual knowledge on demand without competition for substrate. Of course, the light speed barrier might become a problem if you get really far away from the thing.

Yeah, we're on the same path.

I think that we enjoy 'expanded real-time thought' too much. Instead of turning a distant star into a communal calculation machine, I think it would be better to just incorporate an entire star into my mind. That way I can have maximum consciousness. And then other people could have other stars, and we could continue to trade thoughts (with hellishly slow delays) like we do today.

#27 Shannon Vyff

  • Life Member, Director Lead Moderator
  • 3,897 posts
  • 702
  • Location:Boston, MA

Posted 16 March 2008 - 03:02 AM

I'm not too surprised that most here want to know every concept knowable...

#28 jackinbox

  • Guest
  • 452 posts
  • 4

Posted 16 March 2008 - 04:01 PM

This is pretty much a blue pill red pill question...

#29 Sozin

  • Guest
  • 22 posts
  • 1
  • Location:Connecticut

Posted 19 March 2008 - 09:27 PM

This is pretty much a blue pill red pill question...



Haha, so true. Anyway, I chose "other" just because I was actually unsure of what to choose. Gaining the superintelligence described is kind of like using cheat codes in Quake: after you play around with the cheat codes for a while, the game becomes boring. But then again, I think there will always be something to learn in the universe. For example: did you know that women blink about twice as often as men do?


#30 Ghostrider

  • Guest
  • 1,996 posts
  • 56
  • Location:USA

Posted 20 March 2008 - 05:08 AM

After getting the s**t kicked out of me by a hard final exam... yeah, I want to become as smart as possible without becoming unstable.
  • like x 1



