Consequences of Singularity
#1
Posted 20 November 2009 - 09:41 AM
When we reach the point where human thought is obsolete, what will become the purpose of human beings? It seems to me that if all of what we are, think, and can achieve in a lifetime becomes something that can be simulated on a chip the size of a button, human life will no longer be original. We will no longer have purpose. Am I wrong?
#2
Posted 20 November 2009 - 02:01 PM
#3
Posted 20 November 2009 - 09:28 PM
#4
Posted 20 November 2009 - 11:02 PM
If we reach a point where we're so smart that we can't fail to see the wrong answer, wouldn't that take away free choice? We would be making the right choice all the time.
Then again, why would you have to worry over everything? Never making the wrong choice wouldn't bother me in the slightest, if it were a fruit of my being smart.
#5
Posted 20 November 2009 - 11:17 PM
If we reach a point where we're so smart that we can't fail to see the wrong answer, wouldn't that take away free choice? We would be making the right choice all the time.
Then again, why would you have to worry over everything? Never making the wrong choice wouldn't bother me in the slightest, if it were a fruit of my being smart.
Yes, that does seem absurd.
#6
Posted 04 December 2009 - 12:43 PM
When we reach the point where human thought is obsolete, what will become the purpose of human beings? It seems to me that if all of what we are, think, and can achieve in a lifetime becomes something that can be simulated on a chip the size of a button, human life will no longer be original. We will no longer have purpose. Am I wrong?
Yes, you are, because you are not considering the capacity of the human brain and body to exceed their own abilities, through, say, a singularity process.
We cannot assume that computers will keep improving while human nature stands still. Besides, there is a serious problem of ontology: we and computers are dramatically different things!
#7
Posted 04 December 2009 - 01:14 PM
We will merge with machines and become as smart as them.
Agreed, but it's possible that this could be a minority, since most people who follow traditional ways are unlikely to accept such practices, especially among the religious in the United States. Furthermore, it could take some time for those technologies to reach the poorer countries of the world. Gamers and computer programmers/engineers who have lots of money and want to be at the forefront of cutting-edge technology, as well as transhumanists, may be the first to take advantage of such technologies. Among the general population, though, my guess is that the merging of humans and machines will happen in small incremental steps out of necessity (i.e., to treat cognitive decline), as there will no doubt be a majority who are distressed at the idea of suddenly becoming a cyborg.
Edited by Condraz23, 04 December 2009 - 01:15 PM.
#8
Posted 04 December 2009 - 02:49 PM
If we reach a point where we're so smart that we can't fail to see the wrong answer, wouldn't that take away free choice? We would be making the right choice all the time.
Well, free choice is only an illusion as it stands now, although I know what you mean. The illusion is pretty nice to have. I think there's a point at which I would want to stop (or dramatically slow) my enhancement, to avoid the risk of becoming too smart for my own good. Say there's a program I could download into my mind that would cause me never to make an error in thought again (if that's even possible). I really can't be sure what kind of conclusions I would draw. For example, if I deduced that all life is without meaning, I might be inclined to kill myself right then and there, and the current me definitely doesn't want that.