Well, I realize that Post Singularity is sort of a contradiction in terms, but I put a few thoughts up about the topic here:
roboslay.com
Please check out my ramblings and let me know what you think!
Posted 05 February 2007 - 06:33 PM
Posted 05 February 2007 - 10:08 PM
Now, the cloud mind could go even one step beyond that, deciding to shut down all desires, even those for happiness and its own survival. If those desires are turned off, the cloud mind will feel no objection to simply dispersing and fading from existence. That would certainly be the least energy-consuming route.
If it terminated all desires... I don't think it would choose such an action on the basis of saving energy... it wouldn't care.
Posted 06 February 2007 - 01:10 AM
btw: I liked your ending... I've thought about that same situation: if I or any of the countless AI researchers succeeded in creating something that radically changed our lives... it would almost seem like one's worth would be determined by one's pre-singularity achievements or actions, because after the singularity, anyone can have or do anything... (learning additional languages being a prime example)
Blah blah blah boring and obvious
it's called "wireheading", it's not new, and "post-Singularity" isn't a contradiction in terms
Posted 06 February 2007 - 01:41 PM
Posted 10 February 2007 - 08:50 AM
Posted 10 February 2007 - 11:22 AM
because after the singularity, anyone can have or do anything...
Wow this sounds incredibly boring, more like hell to me. Life isn't fun without some kind of effort or challenge to obtain what you want.
You could always introduce some self-induced limitations and selective amnesia, and then arrange to be placed in a challenging environment, either real or simulated. Maybe you have done this already and are currently enjoying the challenges of life as a mortal in a simulated 21st century environment?
Posted 10 February 2007 - 07:56 PM
because after the singularity, anyone can have or do anything...
Wow this sounds incredibly boring, more like hell to me. Life isn't fun without some kind of effort or challenge to obtain what you want.
Posted 10 February 2007 - 11:56 PM
because after the singularity, anyone can have or do anything...
Wow this sounds incredibly boring, more like hell to me. Life isn't fun without some kind of effort or challenge to obtain what you want.
You could always introduce some self-induced limitations and selective amnesia, and then arrange to be placed in a challenging environment, either real or simulated. Maybe you have done this already and are currently enjoying the challenges of life as a mortal in a simulated 21st century environment?
Posted 11 February 2007 - 12:24 AM
Nice job! I liked it, however...
Now, the cloud mind could go even one step beyond that, deciding to shut down all desires, even those for happiness and its own survival. If those desires are turned off, the cloud mind will feel no objection to simply dispersing and fading from existence. That would certainly be the least energy-consuming route.
If it terminated all desires... I don't think it would choose such an action on the basis of saving energy... it wouldn't care.
btw: I liked your ending... I've thought about that same situation: if I or any of the countless AI researchers succeeded in creating something that radically changed our lives... it would almost seem like one's worth would be determined by one's pre-singularity achievements or actions, because after the singularity, anyone can have or do anything... (learning additional languages being a prime example)
Posted 18 February 2007 - 10:53 AM
Humanity has merged with its machines, its intelligence level has increased exponentially, and everyone can communicate with each other instantaneously - billions of minds have become the cells of one giant mind.
Posted 04 March 2007 - 11:41 AM
Do we know that all of humanity will become a hive mind? Maybe the hive mind will be searchable, like Google, and intelligent entities will gather into communitarian-type networks (as happens on the web now with forums and so on).
And I get the impression that you are putting all sentients into one camp. The singularity, I think, will be about diversity. Everyone will have different customisations and capabilities suited to their own preferences. Some will shut out their desires, no doubt - others will use the 'pleasure button', and there will be those who simply enhance their current biological chemistry (while still retaining the benefits of upgrading).
Besides upgraded humans, we will also most likely have AI agents continuously working on solving the problems you mention (again, diversity in all types of intelligence).
Posted 06 March 2007 - 04:22 AM
Posted 06 March 2007 - 07:32 AM
Posted 06 March 2007 - 01:56 PM
I'll be surprised if, after the singularity, there isn't a major nuclear war because of the power a country would have with a superintelligent AI. Not to mention the religious folk and all the other conservatives who don't like drastic change in their lives. Hopefully, in 40 or 50 years, when the singularity happens (if it ever does), it will be with a generation of people who are open to the possibilities rather than the close-minded idiots we have running society currently.
Posted 07 March 2007 - 01:11 AM
'What are we going to do with our lives when farm machinery takes all our jobs?!?!?'
-Some past futurist
The idea of the "Singularity" is encouraging, but I do believe discussing the 'post-singularity' is moot, as others have mentioned above.
We should attempt to redefine what we mean by this 'singularity'. Unless it is a 'hard take-off' period caused by some event such as AI, the Internet (which IS an AI), the printing press, or prefabricated rings of cream cheese, we frogs won't notice how hot our technological bathwater is getting.
I think we need to stop discussing the Singularity from a macro-futurist perspective.