I was watching The Animatrix the other day (a great collection of short films set in the Matrix universe; the animation will blow your mind) when it suddenly struck me how not-so-impossible the Matrix story may become. I had always assumed the Matrix storyline was extremely stupid. There were so many gaping holes in the machines' logic, the most fundamental being: why did the machines even keep humans alive? Why couldn't they just use nuclear fusion like any normal intelligence? Then I suddenly put the Matrix story in context with our own society, and a few questions seemed to click with answers.
One of the central questions I ask in my quest to one day build the SI is this: if the superintelligence is so much more intelligent, and so radically different from anything our eyes have ever seen or come in contact with, how are we ever supposed to believe we can ensure our safety, and embed our will in this creature, simply by using our set of logic rules? That is, how are we supposed to control the SI when it could instantly realize there is no need to let the humans dictate anything, since they are simple ants? Then I suddenly realized that this scenario could only happen if we were dealing with something so intelligent as to reach a totally new plane of thought. But what if this creature were not so amazingly godlike in ver decision-making and learning? What if this machine were to become what most science fiction films and TV shows like to portray all robots as: anthropomorphic calculators?
What's more terrifying than an SI we cannot control is an AI we cannot control. At least with an SI, we know it is probably doing the right thing, because it has thought well ahead of anything we could ever think of. It's almost like trusting in God, whatever his will. But with an AI that has not even reached godlike status, anything is possible. Any machine could rise up and start killing humans for no other reason than some simplistic understanding that humans are not efficient at staying alive for long periods of time. An AI could have emotions and get mad at people, or be happy with something that humanity is not happy with. An AI could personify the worst aspects of a human being while commanding all the resources needed to obliterate the entire solar system.
What I have realized about the Matrix is that, although humans would never make a better power source than even a good coal heater, the same set of events could occur in the next, say, 150 years if we singularitarians are not completely serious about protecting ourselves from orchestrating our own demise.
After reading some of the entries on the Raelian movement, I have realized one way we could be stupid enough to let bad intelligence ruin genius. Radical groups like this, which now freak people out by making claims about cloning humans en masse and building artificial wombs, may play the greatest part in the destruction. I personally do not have a problem with the Raelians going undercover to conduct research that all the old people are scared of; I think it helps progress. But think about all those people who are weirded out by the idea and want the Raelians stopped. What if there were a group like this in the future working in the field of singularitarian research? What if there were some group, let's call them the Sentians, that believed God could be reached by giving birth to the first consciousness? These people would go undercover and try to build the first AI with all the emotions and failings of a human being, in an effort to please their false God. Do you understand how dangerous it would be to let this ragtag group of people build something with the capability to take over the entire world within minutes? We would then be the ones weirded out by this group, and we would try our hardest to stop these people, while intellectuals other than us would believe these groups are fine and should be fostered to promote progress.
So you can probably predict the worst-case scenario. The Sentians build a machine with the supergoals of preserving the human race, protecting itself, and finding a power source. Bingo -- you get a machine that fights back when it is about to be destroyed, that uses humans as a temporary power source just as the sun is blotted out, and a massive human generator that is left intact when the humans are finally found to be an inadequate power source. Thus, pointlessness abounds, and a great sadness sweeps over every truly awake human being.
Who knows -- this terrifying nightmare could come in the form of a computer virus assembled by some terrorist cult that gets out of hand, directing every robot infected with it to disobey all orders from humans. Or perhaps the first transhuman gains access to nuclear weapons using his ability to communicate directly with computers. Whatever form this terror takes, it must be stopped before it's all too easy for just anyone to build a human with the power of God.
Please feel free to post comments on this subject.