If you could augment your intelligence, how far would you go?
#1
Posted 28 February 2008 - 11:32 PM
Suppose artificial neurons were invented that could connect with your brain to make you as intelligent as you wanted. Assuming it was 100% safe and guaranteed to preserve your consciousness and personal identity, would you do it? And, if so, how much?
#2
Posted 29 February 2008 - 12:14 AM
And even if the question didn't concern other people around me (that is, if only I could choose to increase my intelligence and others couldn't), I would still choose to become as smart as possible, for the simple reason that it would give me a better chance of surviving all the dangers that surround us, not only on Earth but in the wider universe. Once I reached the point where nothing in the universe could harm me, I would, just for fun, temporarily "disable" a percentage of my intelligence, probably in a virtual reality, and enjoy myself.
#3
Posted 29 February 2008 - 12:41 AM
#4
Posted 29 February 2008 - 08:04 PM
I think contemplation of the universe alone shows that even at the most extreme levels of intelligence, there are still challenges to be overcome, and there will always be challenges.
#5
Posted 29 February 2008 - 09:00 PM
Though if you're quick and aggressive, you could probably incorporate an entire galaxy before you run into competition.
#6
Posted 29 February 2008 - 09:00 PM
I think contemplation of the universe alone shows that even at the most extreme levels of intelligence, there are still challenges to be overcome, and there will always be challenges.
I think, if you could really understand all of the concepts in existence, you'd be essentially omniscient. You'd instantly foresee any future problems and their solutions (assuming they were solvable at all).
#7
Posted 29 February 2008 - 09:12 PM
I figure that I'd be able to afford at least one star system's worth of substrate. By the time we're mobile enough to get our own starships, there will likely be a much larger population.
Though if you're quick and aggressive, you could probably incorporate an entire galaxy before you run into competition.
Here's a thought. If limited amounts of matter became a problem, we could pool all of it into a universally accessible mega-knowledge base that people could access remotely through their own brains (with some sort of chip, maybe). You'd keep your memories and personality in your own brain but everyone would be able to access all of the universe's conceptual knowledge on demand without competition for substrate. Of course, the light speed barrier might become a problem if you get really far away from the thing.
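The light-speed worry at the end of that post can be made concrete with a quick back-of-the-envelope calculation. This is just an illustrative sketch; the distances below are standard astronomical figures, not anything from the thread:

```python
# Illustrative only: best-case round-trip latency when querying a remote
# knowledge base, assuming signals travel at the speed of light.

C = 299_792_458          # speed of light in vacuum, m/s
AU = 1.495978707e11      # astronomical unit, m
LIGHT_YEAR = 9.4607e15   # one light-year, m

def round_trip_seconds(distance_m: float) -> float:
    """Signal there and back at c; real latency would only be worse."""
    return 2.0 * distance_m / C

for label, d in [
    ("Earth to Moon (384,400 km)", 3.844e8),
    ("Earth to Mars at closest approach (~0.38 AU)", 0.38 * AU),
    ("Proxima Centauri (~4.24 ly)", 4.24 * LIGHT_YEAR),
]:
    print(f"{label}: {round_trip_seconds(d):,.0f} s")
```

Even a lunar-distance query takes about 2.6 seconds each way and back; from the nearest star the round trip runs to roughly eight and a half years, so "on demand" only works if you stay close to the thing.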
#8
Posted 01 March 2008 - 01:20 AM
#9
Posted 01 March 2008 - 02:12 AM
Scott Adams develops an interesting theory in his book "God's Debris". Since God was able to predict the future, he was very bored, so he decided to blow himself up to see what would happen next. That was the Big Bang. It's not a concept I believe in, but it's quite interesting.
I will let Marvin have the final word:
"Here I am, brain the size of a planet, and they get me to take you down to the bridge. Call that job satisfaction? 'Cos I don't."
#10
Posted 01 March 2008 - 09:53 AM
#11
Posted 01 March 2008 - 10:17 AM
I want to become much, much, much smarter than I am now, that's for sure, but I can't say how far I would go, since none of us has any idea what the ramifications of having such intelligence would even be.
#12
Posted 01 March 2008 - 04:11 PM
I think it's somewhat silly to say that we want unlimited intelligence. We cannot, by definition, understand how being that intelligent would feel. Without challenges, life would be terribly boring. It would be impossible to keep our current personality, because our personality is in part defined or influenced by our knowledge and our intelligence.
How you feel depends more on your emotion systems than on your intelligence. We need challenges now because we are only capable of being interested in things that challenge us. However, this doesn't necessarily have to be so. We could, theoretically, augment our intelligence while maintaining the feeling of engagement with whatever interests us now.
As for whether our personality would be the same: that depends on how you choose to define your identity. Personally, I don't want to define myself by my limitations. As long as I had the same interests and values, and I wasn't less intelligent than I am now, I'd consider myself the same person.
#13
Posted 01 March 2008 - 04:30 PM
I think I will upgrade my intellect as soon as intelligence-modifying components are agreed to be safe to use, and I will upgrade to become as intelligent as possible. I will not, however, tamper with my emotions using super-drugs or anything of the sort, since I believe meditation is the best way to do this. That way I am not reliant on the drug, and my feelings of compassion are entirely self-produced. Also, I think I would prefer to continue reading rather than have information downloaded directly into my brain, though this may change. I wish to develop wisdom and compassion on my own, and leave boosting my information capacity and learning capabilities to new technologies.
- Adam
#14
Posted 01 March 2008 - 07:47 PM
#15
Posted 01 March 2008 - 08:33 PM
jackinbox raises a good point though. Once you obtain more intelligence your outlook might change.
Edit: Typo
Edited by maestro949, 01 March 2008 - 08:34 PM.
#16
Posted 01 March 2008 - 11:02 PM
#17
Posted 02 March 2008 - 12:19 AM
I'd connect myself to a network of other minds and control them.
Don't you think there would be safeguards against that? I definitely wouldn't connect if it would make me vulnerable to mind control.
#18
Posted 02 March 2008 - 02:13 AM
#19
Posted 02 March 2008 - 03:09 AM
I would prefer a better memory than an improved intelligence. Knowledge is wisdom!
Why not both?
#20
Posted 02 March 2008 - 04:10 AM
I'd connect myself to a network of other minds and control them.
Don't you think there would be safeguards against that? I definitely wouldn't connect if it would make me vulnerable to mind control.
Of course I would do it without being detected, and it could be subtle control, through images and movies during sleep, and eventually total psychological and mechanical control of the link.
#21
Posted 02 March 2008 - 03:49 PM
I'd connect myself to a network of other minds and control them.
Don't you think there would be safeguards against that? I definitely wouldn't connect if it would make me vulnerable to mind control.
Of course I would do it without being detected, and it could be subtle control, through images and movies during sleep, and eventually total psychological and mechanical control of the link.
What you're imagining sounds more like a hive mind than a knowledge base. I was thinking that it would be something you could read but not modify and that you could access on demand but wouldn't affect your brain without your knowledge and consent.
#22
Posted 02 March 2008 - 04:10 PM
I'd connect myself to a network of other minds and control them.
Actually this doesn't sound that far from what the powers that be try to do with all of the existing media outlets. Whether it be "Buy our Products", "Hate Liberals", or "Believe in our Moral Codes and Follow Our Rituals and You'll be Saved", the name of the game is dictating behavior through viral memes.
Don't you think there would be safeguards against that? I definitely wouldn't connect if it would make me vulnerable to mind control.
But you are connected...
Edited by maestro949, 02 March 2008 - 04:11 PM.
#23
Posted 07 March 2008 - 01:39 AM
#24
Posted 15 March 2008 - 09:43 PM
#25
Posted 15 March 2008 - 09:55 PM
I'd connect myself to a network of other minds and control them.
And of course I'd strive to dominate the universe after that.
#26
Posted 15 March 2008 - 10:09 PM
Here's a thought. If limited amounts of matter became a problem, we could pool all of it into a universally accessible mega-knowledge base that people could access remotely through their own brains (with some sort of chip, maybe). You'd keep your memories and personality in your own brain but everyone would be able to access all of the universe's conceptual knowledge on demand without competition for substrate. Of course, the light speed barrier might become a problem if you get really far away from the thing.
Yeah, we're on the same path.
I think that we enjoy 'expanded real time thought' too much. Instead of turning a distant star into a communal calculation machine, I think it would better to just incorporate an entire star into my mind. That way I can have maximum consciousness. And then other people could have other stars, and we could continue to trade thoughts (with hellishly slow delays) like we do today.
#27
Posted 16 March 2008 - 03:02 AM
#28
Posted 16 March 2008 - 04:01 PM
#29
Posted 19 March 2008 - 09:27 PM
This is pretty much a blue pill red pill question...
Haha, so true. Anyway, I chose "other" just because I was genuinely unsure what to pick. Gaining the superintelligence described is kind of like using cheat codes in Quake: after you play around with the cheats for a while, the game becomes boring. But then again, I think there will always be something to learn in the universe. For example: did you know that women blink about twice as often as men do?
#30
Posted 20 March 2008 - 05:08 AM