Oops, here's the context of the disputed quote...
Q: How long before the singularity?
M: Marvin Minsky says 4 to 400 years. Vernor Vinge says thirty years, but he's not an AI expert. My opinion is that, unless someone comes up with a radically superior way of implementing AI, a totally unprecedented breakthrough, and people have been trying to do that for 40 years and there's nothing like that around now, nobody's going to have the hardware to produce an AI within the next 15 years. It could take any amount of time.
M: But we believe it will happen eventually and it’s worth doing something about. We don’t think we should hold back AI.
E: [two indecipherable words] Because of the prospect of having a major positive technological advancement.
M2: I think the chance of saving the world is less than 1%.
E: I think it’s 2%.
M: That the world needs to be saved at all as opposed to we’re wrong is probably about 75% likely.
As for the question of what AI looks like, well this is something that obsesses many XHs and certainly sparks many imaginations. I interpreted Eliezer's quote as meaning that he's not concerned with the superficial elements, robotics, etc.

Eliezer Yudkowsky's prediction...
Started by Jay the Avenger, Jan 15 2004 09:21 PM
35 replies to this topic
#31
Posted 22 March 2004 - 09:02 PM
#32
Posted 22 March 2004 - 11:02 PM
Hm, hi Danielle, thanks for not including stuff from Eliezer's site. I guess the point of both articles was to make Eliezer look like a nutso nerd rather than a serious threat, although few people would be likely to believe the latter, so the former was the obvious choice. Removing Eliezer's genuine humorous side also helps with the nerd portrayal. The next question becomes, "how far do Singularitarians need to go in order to not be viewed as nerds?" Sometimes it seems like the mental connection between the subject matter and the personalities of its star thinkers is so automatic that even if we attended conferences with purple mohawks and tribal tattoos we would still be labelled as nerds.
If I said "the world needs to be saved at all as opposed to we’re wrong is probably about 75% likely" (I don't remember doing so), then I retract that. It's nearly impossible to estimate that sort of thing. And yeah, thanks for not including in the article that I smoked pot in the past.
Also, I guess all of this means that you think either...
1) the first AIs will act like the dumb AIs in movies, held down by some glass ceiling that prevents them from outsmarting that dashing human hero.
2) AI is a thousand years off, because our intellectual faculties are so complex and subtle that they could never be duplicated in a "machine".
3) AIs, when created, will automatically act nice, because it's obvious that any sufficiently intelligent entity will realize that other people deserve respect.
But anyhoo, thanks for taking the time to come here and give us some info!
#33
Posted 23 March 2004 - 01:14 AM
Michael, I did have to pick quotes, which naturally shaped the tone. Ideally it would have been thirty pages with much more information, starting from the first time I met all of you Singularitarians at the TV thing (your comical singularitarian chant was the first bit to go on edit) to my last email from Eliezer. In full context the 2% quote seemed to me to be Eliezer reacting to Michael's 1% to show that he's perhaps more optimistic about the future. In cutting things down, often for the sake of clarity and brevity, almost everything has to come out of context. Given that, I let the quotes speak for themselves, and the article is mostly quotes.
As for the nerd subtext, that was added by the San Fran Chronicle people. It wasn't in my original Adbusters profile, which has no title. I had nothing to do with that subtext. Although I did hear this revenge of the nerds stuff at TV 2003 and have seen it written by others.
I don't get this nutso nerd versus serious threat thing. I don't view Eliezer exclusively as either of those things. Serious threat to whom? Who's getting scary now? I view the potential use of technology by warmongers as a serious threat and admire Eliezer's goal of creating a FAI. Eliezer in fact is shown to detest many things people stereotype as nerd loves: potato chips, TV, apathy, getting into the cheerleader's pants. Most of the article is about his goal for FAI. There are many sane, reasonable quotes here, especially my favourite: "Strange as the Singularity may seem, there are times when it seems much more reasonable, far less arbitrary than life as a human. We'll do it or die trying. I won't make any move unless I think I'm really, actually going to succeed, even after all human stupidity is taken into account..."
It's exactly this kind of quote that captivates so many people. I believe there is reason behind all of his arguments in this article and otherwise. I didn't set out to discredit him, quite the opposite. He was one of the most interesting people I met at TV 2003, partly because he's not afraid to discuss his fears and largely because he is very dedicated to his pursuit.
1) Maybe it will not be some hunk with glaring white teeth. Maybe it will be Eliezer, with his stooped shoulders and alleged virginity (people have actually sent me emails to dispute the virgin thing, saying it's persona-building hype! one person at TV said that prior to the conference she wondered if Eliezer was in fact an AI). 2) Great minds haven't dared to put a number on it, so no thanks. 3) This is an idea I am optimistic and hopeful about with AI, humans, even journalists. This is what got me fascinated with Singularitarianism to begin with.
My advice on handling the media and the public is not to fear all the emotional responses, and even the knee-jerk stereotyping that some, maybe many, people will make based on information and hairstyle or lack thereof. XH started for so many people with an emotional response to something they read. I think there are some great rallying calls in this Eliezer piece that might get some kid off the Pringles/Jolt/Doom spiral and involved in something.
You will all become much more thick-skinned by your 80th magazine article. Meantime, I certainly didn't want to offend any of you, especially Eliezer. So, sorry if I did.
Thank you for listening and picking back up on this thread. I just found it today, which is why the flurry.
#34
Posted 23 March 2004 - 02:09 AM
Thanks for your pleasant response. Ugh, I wasn't aware that the SF Chron thing was their doing, but I'll let you know that I practically sprayed my coffee all over the newspaper when I happened to notice it that morning.
Oops, regarding the threat thing: some people are scared that Eliezer will literally destroy the world at some point with a self-improving AI that isn't actually friendly. This angle *has* been used in the past.
Lots of people write articles that focus on Eliezer as a person, with the SingInst's mission as a side issue. These articles *always* end up portraying him negatively, so why should any Singularitarian ever concede to talking to any member of the media ever again? The chances of a supportive article are close to nil because the subject is so radical and fringe-y, and the underlying arguments are heavily technical. Since the SingInst is trying to get actual donors, it's probably best if thousands of people's first exposure to the Singularity isn't through these articles.
(Nitpick: Did Eliezer actually say that at age 200 he's going to add on new cognitive capacity to keep from being bored, and at age 2000 he would "probably need serious architectural changes to the mind"? This doesn't sound like something he'd say he personally wants to do, but something he'd probably say *any* human that lives such a long life would need to do. Humans are just not built to last that long. Also, he does not *plan* to be alive after every star in the Milky Way is dead, but *hopes* to be, just like many of us here at ImmInst.)
Heh, the "Pringles/Jolt/Doom spiral". Rereading the article, I guess it's not as awful as I thought, but I'm certainly not used to seeing these types of statements packed into such a short piece; the extended justifications are not there, so any reasonable person (including myself in an alternative universe where I haven't been exposed to sing-ism yet) is going to go "ehhh, these people are pretty weird-sounding". But any publicity is good publicity I guess, so thanks for helping us get more exposure.
By the way, are you familiar with Oxford philosopher Nick Bostrom's paper on Singularity issues? http://www.nickbostr.../ethics/ai.html It's not just computer nerd kids who find Singularitarianism's ideas intriguing, but middle-aged academics too.

#35
Posted 23 March 2004 - 05:52 PM
Eliezer did say all those things about upgrading in an email interview, in response to whether he'd like to be uploaded. He said he'd like to be upgraded, so I asked him to explain. The Milky Way quote is actually lifted from his speech at TV 2003, which he was kind enough to send me. I actually wanted that sentence in quotes since he said it; the way it is, it looks like I'm paraphrasing when I'm actually not. But the editor ignored this request. I was at TV for the speech but my notes were sketchy. Here's that particular graph:
So there is an immense amount of fun out there to be had. Even for humans, the size of human Fun Space is so large that no one individual could possibly succeed in experiencing all the possible kinds of human fun. One person cannot fulfill more than a tiny, infinitesimal fraction of the human space of possibilities. Why? Because one person's lifespan is too short? Of course not. I plan to still be alive after the last star in the Milky Way is dead. But even if you live forever, you will not be able to explore more than a tiny fraction of human fun space before you outgrow it. And that, of course, is the scary part - growing up.
Michael, I agree that the details aren't there, and even if Eliezer had provided more elaboration I couldn't have guaranteed those quotes would have been used. To explain the goings-on to the general public would be difficult and, who knows, maybe he didn't want that info going out. Elaborate on why the Sings have made FAI the main goal? That would mean going over all sorts of dystopian scenarios like goo, etc. Then I'd be facing more charges of scare-mongering. I also wonder if the Sings are trying to maintain a certain amount of mystique, which is fine, but with that comes misunderstanding and the potential that the media will leap to conclusions. What do you think about that?
Yeah, you couldn't help but notice the number of middle-aged academics at that conference. Good, fine, partly for funding, credibility and such. But don't you think it's important to capture the attention of young people? Otherwise, isn't there a risk of Singularitarianism becoming some sort of historic fad, a footnote? I wonder if younger people will take the emerging technologies in stride as extensions of themselves, and maybe some aren't even looking at the ethics and the philosophies of futurism. You are young, so you would know this better than I would. But I believe future generations of XHs and their critics are necessary to make sure society in general is self-analytical. Otherwise I think that people will rush blindly into adopting technologies. Just because someone has a few degrees under his belt doesn't mean he won't rush to get upgraded for at least some of the same reasons a 16-year-old gets a boob job on her birthday: to impress peers. These technologies won't necessarily make people nicer, better, kinder people, whether they're Harvard profs or not. So, I appreciate that many XHs are paying attention to the potential implications.
I know many non-XHs who wonder if we wouldn't be better ruled by an AI, letting the cards fall where they may. Many people are dissatisfied with the present state of humanity. Do we all have a death wish? Many people prefer to play the patient role in deciding their fate. Couldn't these people be considered nutso?
Looking at the above posts, it's interesting how many negative scenarios are discussed. Yet if the media discusses these negative scenarios, we're shock jockeying. It's all pretty darn weird.
I appreciate the Bostrom link. I have read it, and if you have any other good links, please feel free to send them to me. Yes, Eliezer was worried about being singled out, and about singling out the Sings as well, instead of talking about XH in general. Why risk offending the group at large when so many group members have different interests and varied approaches? I understand the importance of getting the movement up and running and the worry that the whole movement will be painted cultishly. But that hive mentality could smack of fundamentalism to some people.
Interestingly, the last time Adbusters tackled XH in any way was by running an excerpt of Bill McKibben's book Enough: Staying Human in an Engineered Age, in the May/June 2003 issue, called Nightmares of Reason. I didn't read the book but was really dissatisfied with the excerpt and the basic premise that we should just say no to emerging technologies. The specific quote is "What makes us special is that we can say no. We can restrain ourselves. We are special because we set limits on our desires." Again, this is merely one excerpt from an entire book, so I won't rush to a conclusion on the book. But I suspect that even this rallying call to "mount some resistance to this version of our future" fell flat for critics of XH and piqued the interest of XH-friendly types. Generally, XHs seemed much more exotic and novel than the old-fashioned just-say-no camp.
#36
Posted 24 June 2004 - 12:08 PM
Danielle, if it was off a taped recording then I apologize for implying that the fault was yours. The quote sounded completely unlike me - I don't believe our chance of survival is 2%, and I wouldn't have thought I would have said it. Maybe it was in a joking tone? Regardless, my apologies for the imputation.