
SL4 Chat, Sept 4


#1 Bruce Klein


Posted 05 September 2002 - 04:17 AM

<Ziana> hiya
<Joachim> Hi BJ
<BJKlein> hi there
<Anand> Eliezer: If not, possibly others could create a, say, Singularity Activists community site, or an SIAI Volunteers site.
<Eliezer> Anand may be a good person to forward any Singularitarian writing to, for example, or to see if he has a copy of something (including advance copies of SIAI materials, which I send him because he needs the raw material)
<Anand> Michael: Pretty much
<Anand> Michael: http://groups.yahoo....ngularityAction (Files section)
<MRAmes> I look forward to seeing the final versions... and using them too.
<Eliezer> BJKlein, meet Christian Rovner, aka xtian
<Anand> Mitchell Howe has three PR versions in the works as well.
<Anand> They're all good, in my opinion.
<Eliezer> Christian Rovner is working full-time on the Singularity and is SIAI's new Volunteer Coordinator
<MitchH> Good for initial release, actually, though I'll still be open to suggestions.
<MitchH> I was going to post a msg to SL4 with links tonight.
<Anand> Mitch: k
<MitchH> Started on a 4th today as well.
<Eliezer> Chris, BJKlein runs BJKlein.com, now known as the Immortality Institute at imminst.org, a website and forum devoted to people interested in immortality
<MRAmes> MitchH: Where is your stuff online?
*Anand* BJ: Do you have Anissimov's phone #?
<MitchH> I haven't put it up just yet. Anand: Any probs with latest revision that you saw?
*Anand* If so, could you ring him and ask him to get on IRC?
<Eliezer> Chris, can I post your email address?
<DmitriyM_plus_MikeLi> yo this is xgl
<Anand> Mitch: Getting ready to look through it. I just got up an hour ago. :)
<xtian> Sure: crovner@lacasilla.com.ar
<MitchH> ok then.
<Eliezer> Hi. Christian Rovner, meet Xiaoguang Li, also known as "Mike" or "XGL"
<BJKlein> sorry.. just returned
<Eliezer> XGL is one of the major people working on the Flare project; he's done most of the work on the interpreter so far
<xtian> Great. I'd like to help him, if I can
<BJKlein> Hello Christian Rovner, nice to meet you
<xtian> Hello, nice to meet you
<xtian> I remember BJKlein's name from Anissimov's Singularity Forums
<Ziana> wb hooky
<Eliezer> If possible, and if Dmitriy and Mike have no objections, I'd like Chris to act as Volunteer Coordinator for Flare as well
<DmitriyM_plus_MikeLi> no objection from me (xgl)
<xtian> I'm not sure I'm fit for that, though
<Eliezer> Which means that you would need to keep sufficiently close tabs on the project to know what kind of help they could use, be able to fit people in, advise them on how to join SourceForge, and so on
<Anand> Dmitriy: How's your job situation? Do you have much time for Flare right now?
<observer> (If xtian's nickname is a parallel to Christmas -> xmas, shouldn't his name be "xian"?)
<DmitriyM_plus_MikeLi> xtian, how did you come to want to devote your life to the singularity? :)
<xtian> Heh
<outlawpoet> observer, his name is christian
<MRAmes> xtian: To coordinate, you need coordination skills, not necessarily the language-creation/programming skills.
<outlawpoet> phonetically, i suppose
<MRAmes> xtian: So perhaps you will be okay coord. the flare project :)
<MitchH> xtian: How did you come to be *able* to donate your life to the Singularity.
<xtian> Basically, I agree with Eliezer's TMOL FAQ, and I don't need to get money for living
<chronophasiac> phew, lucky bastard
<xtian> So I just do it
<MRAmes> chrono: ditto on that comment....;)
<xtian> Yes, my family supports me (though they probably think I'm half nuts!)
<Eliezer> Chris, chronophasiac is Michael Raimondi
<MitchH> His good fortune is our good fortune... no hard feelings from me!
<observer> (I know, Christian -> xian, Christmas -> xmas, replace the letters "Christ" with "x". He left in the "t". But I guess "xtian" is easier to pronounce. Sorry for the distraction.)
<observer> (statements inside parentheses can safely be ignored)
<Eliezer> If I get hit by a truck tomorrow, Raimondi is the closest thing I have to a backup on knowledge of Singularity issues and ethics
<BJKlein> I will be back on later tonight.... but, Xtian, it's great to have you on board with singinst.. let's roll!
<Joachim> Wow
<chronophasiac> hi, christian
<outlawpoet> Whatever works I suppose.
<xtian> Hi
<MitchH> Eliezer, I've decided that for the good of humanity we can no longer allow you to cross the street without a chaperone.
<outlawpoet> perhaps I missed it, But where are you located, xtian?
<Anand> Cheers BJ
<xtian> Buenos Aires, Argentina
<Eliezer> Raimondi can't actually take over the job of creating AI, but he could probably tell you whether a given person *could* take over that aspect of the job, again in the event that I get hit by a truck
<Joachim> I forget Raimondi's story; What do you do?
* MitchH pushes for legislation to have Eli-pushers installed on the front of all trucks.
<DmitriyM_plus_MikeLi> i drive a truck :)
<xtian> I see
<chronophasiac> joachim, i'm a full-time Singularitarian, at present i do whatever i can find to make enough money to subsist
<Ziana> eli- and what if he doesn't find that 'given person'? ;-)
<Joachim> Wow again!
<Eliezer> Raimondi moved to Atlanta to be closer to the Singularity Institute, so he's actually shown that he's willing to restructure his life for the Singularity, part of the reason why I've designated him as my backup on Singularity issues
<observer> (but what about falling refrigerators? Should all refrigerators have parachutes installed?)
<Ziana> hiya
<observer> (should we adopt the convention of putting all jokes inside parentheses?)
<Eliezer> 'scuse me while I fax everything to xtian
<Anand> Christian: Regarding funding for SIAI, do you know others who may be helpful? *Wink-wink-nudge-nudge-know-what-I-mean*
<Ziana> wb chris
<Ziana> (observer- lol)
<crovner> Unfortunately, no
<Anand> Christian: OK
<Anand> Hi Brian
* MitchH wonders if SI can figure out a Konami-style password of winks, nods, and nudges to make its programmers let it out of a box.
<Eliezer> Christian Rovner, this is Brian, the President of the Singularity Institute
<crovner> Actually, I have a *very* hard time explaining the Singularity to all the people I know
<Brian> Hello
<crovner> Hello
<Eliezer> His decision to fund the Singularity Institute is how it all got started.
<Joachim> Yay!
<DmitriyM_plus_MikeLi> when was this chat announced?
<Eliezer> It was announced on SL4, yesterday
<DmitriyM_plus_MikeLi> oh
<Joachim> You know my Singularity t-shirt didn't come in time for the convention. :(
<Eliezer> My Singularity T-Shirt didn't arrive either, actually
<Eliezer> Still hasn't arrived
<Joachim> but i have singularity sneakers that i designed at customatix.com
<Joachim> they have brains on them, and say SIAI
<Eliezer> that reminds me: Christian Rovner, meet Jason Joachim
<crovner> Hi
<Joachim> Very nice to meet you
<Ziana> joa- interesting!
<Ziana> hiya
<Eliezer> Joachim volunteered to pass out flyers at the recent Worldcon in San Jose (Vernor Vinge was guest of honor)
<crovner> Ah yes
<observer> (one of the main things people have a hard time believing about the Singularity is the possibility of seed AI, and the ability of humans to program a seed AI)
<Eliezer> Christian Rovner, meet Sabine Atkins, one of the Directors of the Singularity Institute, and the person who designed SIAI's current website
<Joachim> I could have used some help. It's fantastic that youre on board
<Anand> Hello Sabine
<crovner> Hello Sabine
<sabine> Hello <smile>
<Eliezer> Ziana Astralos is designing SIAI's new website, and designed our new logo, which the Board of Directors recently adopted
<sabine> yeah!
<observer> (so there really is a board of directors...)
<Anand> (Incidentally, Ziana is also responsible for the Extropy Institute's website (www.extropy.org).)
<observer> statements in parentheses can be ignored
<observer> (oops, I didn't mean that in reply to anand's comment)
<Ziana> (lol)
<Anand> Mitch: Paper reads well to me. Good changes.
<MitchH> Anand. Thanks. I'll get those up then.
<Anand> k
<Eliezer> Brian, Sabine, and I are collectively SIAI's Board of Directors
<observer> oh
<Eliezer> Any sufficiently large policy decision would need to go through the Board
<Eliezer> you can reach all of us simultaneously by sending email to institute@singinst.org
<Joachim> letting the spam loose right ...now
<observer> (you forgot the parentheses)
<Eliezer> neither Brian nor Sabine volunteer much for day-to-day operations, but you'll know where to reach us if you have a policy issue
<Joachim> (oops)
<hookysun> or did he?
<hookysun> ()
<crovner> ok
<hookysun> (oops in()'s not encouraging)
<Eliezer> Mitch Howe is writing Singularity materials, and I think is still working on a Singularitarian FAQ
<MitchH> Yes, though temporarily stalled on that last front...
<MitchH> But I'm cranking out the introductory essays...
<Joachim> Thats got to be so hard.
<Anand> Indeed
<MitchH> Never has so much thought been put into saying so little.
<Eliezer> takes a lot of practice
<Eliezer> and even then it's still hard
<MitchH> Or so much, with so little...
<observer> btw, Eliezer, is it a bad idea to spread the fact that CFAI is incomplete?
<observer> (sorry for introducing a new topic)
<Eliezer> no, observer, just a fact
<Eliezer> specifies structure, not content, doesn't claim to do otherwise
<observer> ok, so whenever I talk about CFAI, it is a good idea to mention that it's incomplete?
<MitchH> Incomplete is such an ugly word. You may just want to refer to CFAI as a framework.
<observer> good suggestion, I guess
<Joachim> I got a buzz off this chat. Such community...
<observer> but it's a bad idea to claim that SIAI "knows" how to create a Friendly seed AI, or that they believe that they can create a Friendly seed AI?
<Anand> Observer: Clarify that Friendliness architecture is approx. complete, according to Yudkowsky, but not Friendliness content.
<observer> hmm
<Eliezer> Michael Roy Ames, who is currently AFK, is a Canadian programmer who helps support SIAI and is also interested in volunteering for any projects that fall within his purview
<Anand> Observer: No
<Eliezer> but he's AFK so I can't ask him what his purview is
<Eliezer> however, he is currently volunteering for an AI project called Novamente, not because he thinks it's a seed AI but to gain AI experience
<Joachim> wow, quite a disclaimer.
<outlawpoet> hm, in the interest of clarity, I'm also a volunteer on that project
<observer> (I also match that description, except for the name, and the part about Novamente)
<outlawpoet> MRA's involvement is rather minimal
<Ziana> that's the one Ben Goertzel is involved in?
<Eliezer> Is there anyone else here who would be interested in volunteering for Singularity projects of [type X] and who'd like to introduce themselves? Now'd be a good time...
<outlawpoet> he's largely just a member of the newbie mailing list, and a commentor on the documentation
<outlawpoet> much like my own involvement...
<outlawpoet> Yes, that would be Ben Goertzel's project
<outlawpoet> www.realai.net
* Ziana nods
<observer> Uh, I'd like to volunteer for proofreading Singularity documents and reviewing AI programming ideas...
<Anand> To clarify, I'll be fundraising for the remainder of '02.
<observer> (I invented a simple but interesting programming language called StrandC, which was part of a childish attempt at AGI during high school)
<Anand> Eliezer: There are issues I'd like to discuss once you're ready.
<outlawpoet> I suppose I could introduce myself...
<outlawpoet> My name is Justin Corwin, and I'm a singularitian
<observer> I'd eventually like to get involved in programming Flare, or possibly the AI itself.
<outlawpoet> (waits for the universal, "hi, justin")
<observer> (hi justin)
<crovner> Nice to meet you :)
<outlawpoet> ah well
<Ziana> i was about to say... ;-)
<observer> (yes, I meant to use parentheses)
<Eliezer> Good evening, Mr. Corwin.
<outlawpoet> thank you all.
<outlawpoet> as some of you may or may not know, I'm relatively new to the internet, as these things go
<outlawpoet> hence no website
<outlawpoet> I have spoken on the topic of AI and the coming singularity at several venues, to get a feel for it, and consider myself decently practiced as a public speaker
<outlawpoet> I've also been conducting some private research re: Seed AI safety precautions
<observer> (I'll post a better description of myself, my project, and how I plan to help the Singularity, in my "JOIN" post to sl4)
<outlawpoet> some of you may remember my AI-boxing experiments
<Anand> Justin: Care to give more detail on your public speaking experiences?
<outlawpoet> sure.
<Anand> Thanks
<Anand> http://www.fortune.c...r40/snap_3.html
<outlawpoet> When in jr. high, I became an anarchist. This being relatively important in my worldview, I participated in some debates at the local university
<Anand> http://www.fortune.c...40/snap_15.html
<outlawpoet> finding myself well qualified for the activity, I went on to be a state leader in Mock UN and Mock Trial Activities once I made it to high school
<outlawpoet> however, having no interest in being a lawyer, my further public speaking was limited to public debates of various kinds until earlier this year.
<outlawpoet> When I hosted a futurist gathering at the University of Utah
<outlawpoet> any utahns in here, by chance?
<outlawpoet> no?
<Anand> MitchH, I believe
<MitchH> yes
<MitchH> Provo
<outlawpoet> anyway, the most significant difference in speaking on the singularity is that you must be extraordinarily careful not to set off 'kook' alarms in people's heads
<MitchH> hehe. agreed.
<Joachim> I should think that a stranger speaking before a mixed audience about the singularity would be heckled mercilessly.
<outlawpoet> in some cases
<outlawpoet> hecklers are decently easy to deal with in a formal debate
<outlawpoet> in more informal settings...
<outlawpoet> well, I'm usually a better heckler than they are, so I just heckle them back..
<observer> (is it possible to talk about the Singularity - an sl4 topic - without setting off 'kook' alarms in normal people's heads?)
<outlawpoet> sure.
<outlawpoet> The important thing is to dwell on concepts more than consequences
<observer> right
<outlawpoet> smart folks can work out consequences themselves
<observer> (good point)
<Anand> My experience has been more favorable than not on this issue
<outlawpoet> and unsmart or hostile folks can say little if you're simply presenting concepts
<crovner> We should have a Talking-About-The-Singularity-HOWTO
<observer> good idea
<outlawpoet> yes, that would be a good idea
<outlawpoet> I'm willing to author or co-author on that one.
<crovner> We could use a Wiki page for that
<crovner> I have quite a few negative experiences to tell
<Joachim> ha
<outlawpoet> The most vital concept when talking about the singularity or ultratech topics is that you be absolutely certain about everything you say.
<Anand> Christian: Please share
<observer> (I can't help but notice the many parallels between Singularitarians and Christians, particularly the ICOC. Don't bother asking who the ICOC is if you don't know.)
<Anand> Observer: Why not ask?
<outlawpoet> any logical inconsistency will hurt your overall case, even if it doesn't speak to your main topic.
<observer> (because it's off-topic)
<Anand> (International Churches of Christ. OK.)
<crovner> Anand: sure, I will (but they might be rather long for this chat)
<observer> (the ICOC is an amazingly well-organized christian cult.)
<Anand> Christian: Can you hit the salient pts?
<observer> (are you sure you don't want to insist on being called "Chris", or "xtian"?)
<Eliezer> outlaw: you mean, "Select the topics you talk about so that you talk about only those of which you are justly confident", not "Be confident in everything you say whether or not it's right", right?
<outlawpoet> correct
<Eliezer> I assume this is the case but would like to make sure it's clarified
<outlawpoet> even if you are pretty sure of something, if you don't know it's right, and WHY it's right, don't say it
<outlawpoet> save it for a more personal setting
<crovner> Anand, observer: Yes, sure
<outlawpoet> where disclaimers work.
<Eliezer> good point, outlaw
<Anand> Justin: Examples?
<Ziana> wb mrames
<Anand> Justin: From your experience
<outlawpoet> the timing of the singularity
<Eliezer> Incidentally, it looks like the "Introducing Chris" topic has been completed, so the chat is now open again...
<outlawpoet> I don't ever say anything about that
<outlawpoet> because I can't defend it absolutely.
<crovner> Anand: Most people get stuck on their belief that creating AI is just impossible
<outlawpoet> I have my own opinions
<Ziana> [20:47] <Eliezer> Michael Roy Ames, who is currently AFK, is a Canadian programmer who helps support SIAI and is also interested in volunteering for any projects that fall within his purview
<Ziana> [20:47] <Eliezer> but he's AFK so I can't ask him what his purview is
<Eliezer> thanks Ziana
<Joachim> Chris: It's funny when you 'get it' and think you can just share it, but get nuthin'
<Ziana> np
<Joachim> Zianas a robot. yep.
<crovner> Yes, well, it's frustrating
<observer> (another parallel to christianity)
<observer> (should I stop pointing out parallels?)
<MRAmes> Ziana: Sure you can...I'm here :)
<outlawpoet> observer, surface similarities abound between meme x and meme y, but the main objections generally come from different areas
<Eliezer> I think that small hunter-gatherer bands of 200 people with no written literature may have tended to have rather smaller divergences in underlying assumptions
<Anand> Chris: It may be consistently difficult to change the view of someone with a little familiarity of how the concept has been used, and is presently used.
<Eliezer> which may mean that humans are (a) not good at identifying divergent assumptions that need to be addressed, in speaking
<Eliezer> and (b) that we are not good at adjusting to a speaker's divergent assumptions, we just assume ve's crazy
<Ziana> lol joachim!
<Eliezer> ergo a Singularitarian has to learn to address (a)
<Joachim> (i'm so happy to get an 'lol' from you!)
<Anand> Chris: There are clear points of reasoning and physical evidence that you can provide.
<Ziana> (joachim- would you like a certificate to frame? ;-) )
<crovner> To non-english-speaking people?
<Ziana> mrames- eliezer asked that 20min ago, while you were away... if you'd like to answer now, go ahead :-)
<observer> Have you found it hard to argue that it is possible for a human (or group of humans) to program a Friendly seed AI?
<Eliezer> you have to relearn, or learn, how to see your assumptions as salient according to the distance between those premises and the audience's current knowledge
<Eliezer> not salient depending on how sure *you personally* are of them
<outlawpoet> Eliezer has a good point there.
<outlawpoet> but I would add that attempting to determine your audiences assumptions is dangerous
<observer> (you might expect those points to be common sense)
<outlawpoet> particularly if there is a large cultural or memetic difference between you and them
<crovner> observer: I have found it hard to convince my mother that the human brain wasn't designed and created by God
<Joachim> ther you go...
<observer> (I don't expect to ever be able to convince my parents of that)
<outlawpoet> you run a very real risk of offending or miscategorizing someone.
* Eliezer says to observer: "Arguing something is one thing, getting people to believe a true statement for good reasons is another - FAI is complex enough that I sometimes, to my dismay, find people believing it for the wrong reasons."
<Anand> Chris: Why do you need to convince her otherwise?
<crovner> Anand: pardon?
<crovner> otherwise?
<observer> (oops, I should have thought about what I was trying to imply with the word "argue")
<hookysun> what's a wrong reason?
<outlawpoet> The most effective thing I've come up with, is to have a basic set of assumptions that you find a majority of humans subscribe to, worked out beforehand.
<Eliezer> I haven't tried to talk to my relatives much at all... there's no reason why I should have a higher priority for reaching them than for the other 6 billion current bystanders
<Joachim> I feel a real responsibility to others, to get them to see what I think is most Real
<observer> (that doesn't always work, though)
<Anand> Chris: Just curious why you've tried to convince her that the human brain isn't created/designed by God.
<outlawpoet> I'm always updating my list.. to try and make my proselytizing more effective.
<hookysun> like thinking it'll be their genie?
<hookysun> or really wrong?
<outlawpoet> or their savior.
<observer> (the word "proselytizing", another parallel)
<Joachim> Anand: I'm surprised at this line of questioning
<observer> ("savior", another parallel)
<MRAmes> Eliezer: I also have limited my explanations to relatives... concentrated mainly on the ones who are most receptive.
<outlawpoet> observer, not a parallel, a deliberate cross-use of religious language
* Eliezer says to hookysun: "Starting from the observation that most people object to Friendly AI because of inapplicable "Us. vs. Them" thinking, then reasoning that since racism/speciesism must be bad, its polar opposite must be good. This is roughly correct enough to be a good interim platform for reasoning but it's not actually a solid reason."
<observer> I know
<observer> but a parallel does exist
<outlawpoet> which is?
<crovner> Anand: because creating AI assumes evolutionary psychology
<observer> (oops, what did I mean by that?)
<observer> (sorry, I don't know.)
<observer> (mu)
<Eliezer> not really, observer, I think you need to get more distance between yourself and Christianity, right now you're seeing things as influenced by Christianity that don't have any Christianity in the mix
<outlawpoet> crovner, not necessarily
<Eliezer> I did not have a Christian upbringing
<Eliezer> which is why I always find it interesting when people talk about my allegedly Christian influences
<crovner> outlawpoet: I mean real AI
<observer> sorry if I implied that
<observer> I didn't mean to (I think)
<outlawpoet> well, it's not a necessary assumption
<observer> (maybe "subconsciously")
<outlawpoet> you could for example, believe that God's creation of humans is an existence proof of your capability to create one, for example
<Eliezer> well, the AI theory assumes evolutionary psychology and the FAI theory *definitely* assumes evolutionary psychology
<Eliezer> not necessarily all possible AI theories
<hookysun> crovner: i doubt it
<Eliezer> but definitely SIAI's
<hookysun> how does ai assume evopsy?
<crovner> I agree, not necessarily all theories
<outlawpoet> SIAI, superintelligent AI, or singularity institute AIs?
<Anand> At some point, I'd like to discuss a Singularity Activists community website, or SIAI community website, or something to that effect.
<Eliezer> Hookysun, I would describe my AI theory as a branch of evolutionary psychology masquerading as AI, so I definitely agree with Chris here
<Eliezer> SIAI, SingInst
<outlawpoet> anand, do you have one ready to publish, or more in the character of 'what to make?"
<Eliezer> Anand, with Chris on board and real volunteer activities going on, I think it's time for an siai-vol list
<observer> (ironically, the computer I'm on right now is in the office of a christian student group on campus. I'm using this computer because it's the only computer I have access to that can successfully run IRC)
<Anand> I can help with content, but not with design.
<Eliezer> the problem is that the SL4 mailing list is having hosting problems, such as the archives and the spam bounces, which Kia.net doesn't seem able to fix
<MRAmes> Responding to Eliezer's (implied) question about my purview: Currently funding and cheering - but...
<Eliezer> which makes me sorta reluctant to start up another kia-hosted mailing list at this time
<Eliezer> I don't like YahooGroups but I don't see much of a choice right now
<outlawpoet> well, there are personal mailing list programs
<outlawpoet> does anybody have a computer on the net 24/7
<crovner> A friend of mine is probably going to get free hosting soon, and he'll share it with me
<Anand> There is some PR-related material I'd like to see at a site.
<outlawpoet> with a little bit of space on it?
<Eliezer> it'd have to be solidly 24/7, though
<MRAmes> (response cont.) I have many years of programming/leading experience, and will contribute much more time/expertise when the time is right.
<Eliezer> no outages
<Anand> Sorry, there are
<outlawpoet> yeah, hence I'm out, (cable internet, quite flaky)
<Anand> A separate community website may be appropriate, so as to not overload SIAI's site.
<outlawpoet> I have access to an osx box on a business network, but it's not under my direct control so i hesitate to suggest it.
<observer> (I had to spend two hours pretending to be a christian to get access to this computer, though it may have been possible to find another way to join this chat)
<Brian> Is kia.net not able to fix it due to time constraints or they just plain don't know how?
<Eliezer> I don't know
<Joachim> Centralizing volunteer material on the SIAI site would be best. Wouldn't it? It would be encouraging to new prospects...
<Eliezer> I need to call them up and ask if they're overworked, or whatever
<Brian> Perhaps we should look into actually paying them
<Ziana> i'd suggest asking David McF.
<Eliezer> interesting point, Ziana
<hookysun> maybe they're sitting on it?
<Eliezer> do you know what extropy.org runs on?
<outlawpoet> yeah, david runs it
<Eliezer> no, I mean, is it a dedicated boxen at a cohosting location, or what?
<outlawpoet> ah, no, single point of failure
<outlawpoet> under david's direct control
<Ziana> eli- i *think* that's the case, but i've never really asked
<outlawpoet> when it goes down, it goes down, all of it
<outlawpoet> irc servers, websites, etc.
<Ziana> that rarely happens
<outlawpoet> true
<outlawpoet> but it has happened twice in my experience, which is how i know it's all in the same place
<Ziana> yes, it's all one server
<outlawpoet> plus sideways discussions with lucifer regarding the future of the virus website
<Ziana> but the last time was months ago and it wasn't a server problem, it was the ISP having router issues
<outlawpoet> i remember.
<observer> (hmm, silence...)
<outlawpoet> I'll check and see if there are any alternatives to yahoogroups.
<Anand> Be back in 20 mins
<Ziana> nothing good without paying for it
<Ziana> (whereas david would *possibly* be willing to host it for free)
<outlawpoet> hm
<Eliezer> actually, if it's a reliable dedicated boxen, I can see SIAI and ExI clubbing up to pay for it
<Ziana> can't say for sure, of course ;-)
<Eliezer> if that would let us create and run our own Web software
<Eliezer> give people logins so they could control subsections of SIAI
<Eliezer> and so on
* Ziana nods
<Ziana> david's server can do that already...
<observer> (hmm, I may have just realized why it would be a bad idea to use the statement "an AI is basically a logic machine" in an argument about whether Friendly AI is possible. I could be wrong, though...)
<Eliezer> well, yeah, it's not a true statement
<Eliezer> or at least is not particularly more true of AIs than of humans
<observer> (the reason is that a reasonably advanced Friendly AI won't be just "a logic machine", it will be a "mind in general")
<Eliezer> or of proteins
<Eliezer> *no* AI is *ever* a logic machine, not even at the beginning
* xtian sighs
<Ziana> wb again
<xtian> thks
<observer> so "logic machines" shouldn't be called AI's
<hookysun> x!
<hookysun> (get it? "christ!")
<observer> (heh)
<Joachim> agh
<Joachim> Coordinator X!
* Eliezer says to xtian: "Don't worry, they make fun of me too."
<hookysun> no, observer
<observer> ?
<xtian> Heh
<observer> (His name begins with the letters "Eli")
* xtian gets it late
<outlawpoet> only because Eliezer won't come and part the great salt lake for me, despite being an obvious old testament figure.
<observer> (as in "the great prophet Eli")
<outlawpoet> another amusing episode was when Nicq Macdonald assumed Eliezer was an older fellow, simply because of his name, after Eliezer Google-spoofed him
<Ziana> lol
<outlawpoet> i think he called him "an omniscient old man"
<Ziana> lol!!
<observer> (wasn't the Eliezer of the bible just a regular person who happened to be a friend of abraham (or was it one of abraham's kids?)? I should know this...)
<outlawpoet> yeah, Eliezer was a minor figure in the bloodline of christ.
<outlawpoet> but it's so much more fun to equate him to Elijah or similar
<Eliezer> you're thinking of Elijah the prophet, observer; Eliezer was a household slave of Abraham
<Eliezer> and...
<Eliezer> NONSL4
<outlawpoet> bwa ha ha ha ha!
<xtian> Heheh
* hookysun nods agreement
<observer> right, sorry
<hookysun> i can volunteer part of the time
<hookysun> shall i intro or whatever?
<outlawpoet> yeah, hooky, give us a rap bout yourself
<observer> so, do you believe you currently know enough about seed AI and Friendliness to be able to create a Friendly seed AI given an infinite amount of computing power?
<observer> (I was asking Eliezer)
<outlawpoet> (rap as in discussion a la 60s, rather than rap like hiphop rhyme)
<hookysun> i can't dedicate all my time cause i have a child to support
<Eliezer> observer, I think so, but it would be very dangerous to play around with real infinite computing power
<Eliezer> but in the spirit of your question... yes, I think so
<Eliezer> though there are specific FAI questions which I do not know the answer to, but which I expect I would learn in the course of creating AI
<outlawpoet> hey, you never take my questions in the spirit in which they're given!
<Eliezer> such as the skill of specifying unambiguous references to the external cause of an effect the AI can observe
<observer> so now it's just a question of how to do it with a finite amount of computing power... (?)
<hookysun> of course i would consider cfai a suitable form of child support, tho' the courts would be skeptical
<observer> heh
<Eliezer> teaching an AI to find the referent of ambiguous communications using speaker motivation assumptions and mapping based on similarities
<outlawpoet> dude, how are you connecting to this session?
<xtian> I'm in a cyber-coffee :p
<Ziana> three times in less than six minutes... going for a record here? ;-)
<Eliezer> distributing extrapolative heuristics and corrective heuristics across a content base
<outlawpoet> aren't you the adventuresome one.
<observer> would it be a good idea to give the AI a "completely unambiguous language"?
<Eliezer> other questions of FAI content not covered in CFAI
<Eliezer> no such thing, observer, sorry
<outlawpoet> hm
<observer> wow you type fast
<Joachim> I'm looking for ways to volunteer as well.
<Eliezer> the miracle of Dvorak
<xtian> Do you use a Dvorak keyboard?
<outlawpoet> oh yeah, miscellaneous plug, if anyone here has a personal computer, immediately go and buy a Dvorak keyboard.
<observer> (wow, I should have expected the great Eli to use a dvorak keyboard...)
<outlawpoet> Even I can use it, with my poor micromotor skills
<hookysun> i'm not in school because i'm not pursuing a degree
<xtian> Sorry for the off-topic question
<outlawpoet> plus it protects against Repetitive Stress Disorder
<outlawpoet> which is a crippling and nasty ailment
<BJKlein> i'm learning dvorak :)
<observer> (I didn't know that it prevents RSD... that's worth knowing. thanks)
<outlawpoet> My only complaint is that it clicks loud, and wakes up my girlfriend if I'm really pounding on it.
<MitchH> (but you don't need a Dvorak keyboard to learn Dvorak if you already know how to type properly)
<BJKlein> heh..i have the same problem justin
<observer> hmm
<xtian> You'll need a lot of stickers
<observer> is a dvorak keyboard a different key arrangement, or just a differently shaped keyboard with the same arrangement?
<outlawpoet> depends
<outlawpoet> mostly it's just an arrangement
<outlawpoet> but many dvoraks also incorporate split style key placement
<hookysun> what degree would be conducive to the singularity?
<Eliezer> I don't have a hardware Dvorak keyboard, I just memorized the layout and switched the software
<MitchH> (if you already know how to type correctly you're not looking at the keys)
<Eliezer> took a couple of weeks, but after that, voom
<MitchH> same here
<observer> (!!!)
<outlawpoet> it's pretty simple, just open up a word processor and type until you get it right.
<observer> (Eliezer stuns us all again)
<Ziana> lol
<Eliezer> this is pretty trivial, observer
<MitchH> there is a good practice site online that I used to learn the keys
<Eliezer> I know it sounds impressive, but it's not
<outlawpoet> oh, come on, observer, when was the last time anybody looked at their keyboard?!!?
<hookysun> earlier i said i'd present whatever curriculum seemed to work, but how would i measure success?
*Eliezer* How much time did you spend looking to find powweb.com?
<observer> (uh, just as I am typing this...)
*Eliezer* Is it optimal?
<MitchH> My only beef since switching (and not using stickers) is that hotkeys are rarely in convenient locations.
<outlawpoet> right... right...
<outlawpoet> never mind.
<Joachim> I'm fantasizing about being in a cybercafé in Buenos Aires... exotic.
<outlawpoet> heh heh heh.
<MitchH> http://www.karelia.com/abcd/abcd.html Go ahead. Learn Dvorak.
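(The software-only switch Eliezer and MitchH describe, remapping the layout in software while keeping the physical QWERTY keyboard, can be sketched as a small practice script. The layout strings below follow the standard ANSI Dvorak arrangement; the function name is purely illustrative:)

```python
# QWERTY -> Dvorak remap: for each physical key, the character printed on the
# QWERTY keycap versus the character a software Dvorak layout actually emits.
QWERTY = "qwertyuiop[]asdfghjkl;'zxcvbnm,./"
DVORAK = "',.pyfgcrl/=aoeuidhtns-;qjkxbmwvz"
REMAP = str.maketrans(QWERTY, DVORAK)

def dvorak(keystrokes: str) -> str:
    """Show what a software Dvorak layout emits for a run of QWERTY keystrokes."""
    return keystrokes.translate(REMAP)

# Pressing the physical keys labeled j-d-p-p-s on a Dvorak layout yields "hello".
print(dvorak("jdpps"))  # -> hello
```

(Characters outside the remapped set, such as digits and the space bar, pass through unchanged, which matches how the layouts actually differ.)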
<observer> (I feel even more unworthy now...)
<outlawpoet> joachim, i was doing the same thing a moment ago
<observer> (to be in the presence of the great Eli...)
<outlawpoet> observer, that's really kind of weird how you do that.
<Eliezer> Hm...
<observer> (you caught the disclaimer about parentheses, right?)
<Ziana> lol
<MRAmes> w b
<Eliezer> I suddenly noticed one of my assumed shared assumptions that I don't actually know to be shared
<outlawpoet> what's that?
<MRAmes> what assumption?
<observer> yeah, what assumption?
<outlawpoet> which! which!
<Joachim> yeah?
<xtian> ?
<Eliezer> observer, Singularitarians tend to be interested in evolutionary psychology, for purposes of debugging their own psyches to run in the 21st century instead of hunter-gatherer tribes
<observer> I know I should be interested...
<MRAmes> Fits to me, yeah... go on.
<xtian> Fits on me
<Eliezer> one of the concepts we tend to be particularly unfond of is that of "tribal chief"
<MitchH> ditto
<Joachim> same here
<outlawpoet> heh.
<MRAmes> huh?
<MRAmes> Not!
<Eliezer> I am not the chief of the Singularitarian tribe
<outlawpoet> I use the memeobject "guru" for that one, i think.
<Eliezer> there is no chief
<MRAmes> Oh.... sorry.... take a negative on that!!!!
<observer> but... you're the most important person on the planet...
<MRAmes> I *am* unfond of the tribal chief idea!
<outlawpoet> hm.
<xtian> I don't trust *any* kind of authority
<MRAmes> observer: not any more.
<observer> ?
<Eliezer> in some of the various possible linear orderings of Singularitarians by, e.g., demonstrated knowledge of AI, and so on, I might come out first, but this does not imply the activation of those hardwired emotions having to do with social status, dominance, and so on
<MRAmes> observer: not since he published his great ideas about FAI and SIAI.
<outlawpoet> not only that, but it's not as if we know future history
<hookysun> i'll do a cogent, clear intro as a JOIN for the mailing list
<Eliezer> just that if you rank any group using a linear ordering, someone will end up first, whether or not that means anything
<Eliezer> it applies to minds-in-general, including those with no support for human dominance/social emotions
<observer> sorry, I guess I'm not at sl4 yet. I expected that.
<MRAmes> The 'importance' of Eliezer is that he had the ideas.
<Eliezer> that's the kind of "social" role that I, personally, try to achieve in my dealings
<Eliezer> to be the same person I would be if we were all minds-in-general
<Eliezer> not humans with hardwired ideas about tribes and dominance
<observer> hmm
<xtian> I agree
* MitchH informs the room (those reading during conversational lulls, at least) that his 3 new essays are up at http://www.iconfound...ademy/works.htm
<MRAmes> observer: now that the ideas are shared, others can 'run' with them.
<observer> sorry for making you waste your time to explain all this...
<outlawpoet> thanks mitch
<MRAmes> no prob.
<MitchH> np
<Ziana> wb
<xtian> wb
<Anand> Thx
<observer> I have a lot of mind-restructuring to do...
<MRAmes> don't we all
<MRAmes> :)
<xtian> I'm afraid I'll have to make a (probably silly) remark every minute or so, in order not to get disconnected
<Eliezer> ah
<Eliezer> I suppose that's one way to keep you involved in the conversation
<observer> (remember to use brackets if it's too silly)
<observer> (I mean parentheses)
<xtian> (cool)
<Anand> Could someone send a log of the chat so far? trans_humanism@msn.com
<hookysun> ok
<Anand> Thanks!
<Eliezer> Anyway, observer, that's the reason why I tend to be uncomfortable around "Eliezer is the Messiah" jokes, because of the tribal chief attractor in human thought
<xtian> Are there archives of these chats?
<Eliezer> not yet
<outlawpoet> really? I thought it was just because you were uncomfortable with hero-worship.
<Eliezer> the way in which I personally organize my own debugging is by learning active dislike for the hardwired emotions that I believe are irrational, counterproductive or outright evil
<observer> should I have waited until I was really at sl4 to join this chatroom? or perhaps joined anyways and just kept quiet until then?
<outlawpoet> eh
<outlawpoet> it's not as if all of us are at the same qualitative level.
<Joachim> We don't need a tribe, we want a Singularity.
<xtian> Anyway, most of the people who would consider you a Messiah just can't understand you
<Eliezer> one of the very earliest hardwired things I learned to dislike - to flinch away from - was the concept of personal importance / tribal chief / dominance
<Joachim> Observer: what are you looking for?
<observer> uh, the Singularity (specifically, Friendly seed AI)
<Eliezer> I have roughly the same reaction to that, when I perceive it, as you would to a bug crawling up your arm
<Eliezer> "ugh - squish"
<observer> sorry
<Eliezer> that's how personal debugging works for me
<observer> I must really annoy you
<xtian> .
<observer> (no, I'm not stating that annoying you is an important goal, in case that was ambiguous)
<Joachim> nah, as far as newbies go you're just fine
<Eliezer> well, whether you annoy me is not always the point, but in this case, yeah, it gives me the crawling shivers to see other people saying these things, because that's how I trained my mind to react to those statements when they are produced internally
<xtian> .
<Anand> Chris: Could you restate whether you're full-time with SIAI?
<observer> Okay, I'll try your strategy.
<outlawpoet> Eliezer, have you considered alternatives to aversion?
<observer> Even though unpleasantness is, uh, unpleasant...
<Eliezer> no, outlawpoet, this works very well
<Eliezer> I don't want to mess with what works unless I see something better, and I don't
<observer> "It's for the Singularity"
<Joachim> hm
<observer> Is that an acceptable statement to use?
<Joachim> is it honest?
<crovner> .
<MRAmes> Aversion is quite effective, and not really negative/destructive.
<Joachim> Crovner: try 'ping?'
<observer> honest as in not a rationalization?
<crovner> ping
<outlawpoet> hm, I can understand that.
<crovner> "/ping"?
<MRAmes> So it is not really a 'sacrifice'... just reprogramming for efficient operation.
<Anand> Yes
<observer> And sl4 already has plenty of jokes, we don't need any Eliezer-worship jokes.
<outlawpoet> It's just interesting to me. I've never tried using aversion, I've always been afraid to, for fear of losing useful information
<MRAmes> Its reversible.
<observer> really?
<outlawpoet> instead I use an associative mindstate: when I catch myself falling into a thinking error, I default into a neutral reflective mindset
<Joachim> I expect that Eliezer was only able to 'use aversion' when he became grossed out by... I don't want to say it... irrationality?
<Ziana> bad timing on that disconnect... right as i was about to say, i'll be going now ;-)
<MRAmes> absolutely... you probably can think of examples from your own life where you have reversed an aversion.
<outlawpoet> yeah, i have a few examples.
<outlawpoet> hm.
<MRAmes> outlawpoet: 'neutral reflective mindset' Great, when you can do it :)
<outlawpoet> heh.
<outlawpoet> yeah, i suppose that's true.
* Ziana waves
<observer> I guess I should also avoid statements that aren't intended as Eliezer-worship, but might be misinterpreted as such
<outlawpoet> I just call it my debugging IDE
<outlawpoet> it's an attitude dominated by the concept that I'm a general intelligence running on hostile hardware
<MRAmes> heh
<Anand> Chris: Do you use AIM, Windows Messenger, Yahoo Messenger, ICQ?
<outlawpoet> it's not quite a correct concept, but it helps with motivation.
<Anand> If not, I suggest getting one
<Joachim> Outlaw: thats beautiful
<hookysun> chris = xtian?
<Eliezer> that's going in my quotes file, outlaw
<Anand> You can use them all through Trillian (www.trillian.cc).
<MRAmes> outlawpoet: so... you refer to yourself as a 'general intelligence' too? Ha. I thought it was only me. (and Eli maybe)
<xtian> Presently, I have no Internet at home, so I'm in a cybercafé. But I'll sign up with an ISP this month
<outlawpoet> aww, you guys.
* outlawpoet would blush.
<Anand> Chris: k
