"Humans should not stand in the way of a higher form of evolution. These machines are godlike. It is human destiny to create them."
Screw that, I'm taking out Skynet before it takes me out.
Posted 27 October 2008 - 01:37 PM
"Humans should not stand in the way of a higher form of evolution. These machines are godlike. It is human destiny to create them."
Posted 27 October 2008 - 02:24 PM
Ah yes, thank you. I remembered that quote. Got this from Wikipedia:

De Garis: "Humans should not stand in the way of a higher form of evolution. These machines are godlike. It is human destiny to create them." — as quoted in New York Times Magazine of August 1, 1999, speaking of the 'artilects' of the future.
Posted 27 October 2008 - 02:28 PM
I have no problem at all with "These machines are godlike. It is human destiny to create them." - I agree completely.

"Humans should not stand still on the path to a higher form of evolution. We are godlike. It is human destiny to create us."

I don't really like the religious language at all. If anything, that only incites people. We in the thinking business always have to be conscious of branding.
Edited by Mind, 27 October 2008 - 04:47 PM.
Posted 27 October 2008 - 02:33 PM
Edited by Matt, 27 October 2008 - 02:38 PM.
Posted 27 October 2008 - 04:20 PM
The point was that he seemed to be knowingly pursuing this as a goal. o_O I've never heard him actually say he wanted all these billions of people to die; he just thinks it's probably an inevitable outcome of our progress.
"Missing the point here. For one, his AI approach isn't hastening progress toward AI in any real way..."
Yes, maybe he is hastening the progress with his 'brain building' projects.
Agreed. If artilects take over from humans, I'm not sure this is a bad thing. My optimistic future is that I evolve alongside machines.
Edited by Savage, 27 October 2008 - 04:25 PM.
Posted 27 October 2008 - 04:21 PM
Posted 27 October 2008 - 04:27 PM
"sometimes you just have to pretend you didn't hear someone say something... Shannon ... O_O"
Yes, I've long been an SI supporter. I followed up that what I thought is not likely to be what will be, as no one really knows. There are many proposed goals of AI. de Garis in particular does feel humans will become extinct, but does not view that as a bad thing.
You can listen to the whole recording at the Ustream channel; it is quite fascinating, and he is articulate--he does not sound insane to me.
Edited by Savage, 27 October 2008 - 04:30 PM.
Posted 27 October 2008 - 04:59 PM
"Missing the point here. For one, his AI approach isn't hastening progress toward AI in any real way..."
Posted 27 October 2008 - 05:13 PM
k. not the impression i have been getting. anyway, enough of the de Garis bashing.
My feeling, based on the content of the interview, is that he would like to see AI developed (or a cyborg evolution) because he is curious to see the next step in evolution (as many of us are) and to achieve things like practical immortality.
Posted 27 October 2008 - 05:42 PM
"k. not the impression i have been getting. anyway, enough of the de Garis bashing"
Posted 27 October 2008 - 05:56 PM
Hahaha. Pretty funny. I doubt de Garis is REAL close to developing a TRUE AGI, so we will have a few years to monitor things and find out his true intentions.