
Chat For Sun Nov 3rd 2002



#1 Bruce Klein


Posted 04 November 2002 - 10:31 AM


<BJKlein> Welcome
<BJKlein> Official Chat Begins Now
<BJKlein> Topic: Existential Risks
<BJKlein> Immortalists take great interest in limiting all risk to life... but, I'd like to focus on Existential Risks in this chat tonight.
<BJKlein> One of the best papers on the subject is by Nick Bostrom: http://www.nickbostr...tial/risks.html
<BJKlein> This topic is often overlooked because it's usually a clear cut argument.. for example: War is bad and asteroids kill. Thus, not much to discuss... but, I'd like to touch on the topic tonight in general terms at least and hopefully discuss more specific risks at most.
<BJKlein> While humans are living longer and learning how to protect life and health with greater efficiency, we're still open to catastrophic threats from a wide range of effects. I'll list some now and we can discuss more.
<BJKlein> Existential Risks: (in no specific order)
<BJKlein> 1. Environmental Risk (Global Warming/Virus)
<BJKlein> 2. Global War/Terrorism (Biological/Nuclear)
<BJKlein> 3. Threats From Space (Asteroids/Aliens)
<BJKlein> 4. Runaway Tech (Gray Goo/AI)
<BJKlein> Ok, we're open for discussion...
<BJKlein> Thanks!
<MichaelA> Ok
<BJKlein> Do you have a risk that you'd like to share?
<MichaelA> Yes
<MichaelA> Eclectic Pseudoplague
<MichaelA> made up by Mitch Howe
<BJKlein> Uploading Idiots?
<MichaelA> it is nanomachines, or rudimentary microdevices, used in coordination with cheap bioweapons
<BJKlein> ahh..
<MichaelA> Uploading idiots, let's see
<MichaelA> It's hard to say what constitutes an "idiot"
<MichaelA> Selfish uploads
<BJKlein> It seems that run-away tech will be the biggest threat.... in my mind, in general...
<MichaelA> I agree
<BJKlein> man made that is
<MichaelA> Yeah, natural threats have been around for a long time
<BJKlein> asteroids seem to be the most likely natural event
<MichaelA> Um, I dunno
<MichaelA> What about semi-natural stuff like global warming?
<BJKlein> good point
<BJKlein> ice ages have killed many
<MichaelA> Asteroids happen really really rarely
<BJKlein> but the scope is much wider
<MichaelA> Not usually
<MichaelA> most asteroids are tiny
<BJKlein> or rather the impact is more deadly
<haploid> Global warming is a slowly progressive problem. It is easy to prepare for - quite unlike asteroids.
<BJKlein> impact... heh
<BJKlein> 65 million years is a long time.. but
<BJKlein> we're due for another of that size soon
<MichaelA> The hardest thing to prepare for is probably unfriendly AI
<MichaelA> Because the only defense is either no AI research or benevolent human-equivalent AI
<BJKlein> BTW, the "Guest" is my dad!
<BJKlein> hi dad!
<Guest> hi
<BJKlein> ;)
<MichaelA> Hello Mr. Klein
<Guest> hello michael
<BJKlein> I'm talking with mom on the phone.. she says she is coming up next weekend
<Eliezer> wow, she must live a long way from this chat channel
* MichaelA snickers
<Guest> b'ahm vs fairhope
<BJKlein> about.... 4.5 hrs drive :)
<MichaelA> Well let's see, trying to get back on subject, heh...
<BJKlein> yes..
<Guest> talking about risk that's driving on the interstate
<BJKlein> sorry
<BJKlein> lol
<MichaelA> Yeah
<MichaelA> For personal death, driving is actually the most likely way to die
<Guest> true
<MichaelA> oh yes
<MichaelA> Bathtub of anthrax
<MichaelA> Or, swimming pool
<MichaelA> Wasn't there a James Bond movie about that? hah hah
<Guest> tell mom she needs to let you go
<BJKlein> heh.
<Guest> there's a bond movie on tonight
<MichaelA> lol, which?
<BJKlein> she's telling me "important" stuff
<Guest> i think the last one
<MichaelA> brb in a few mins
<urgyen> :-) so is this, existential risk, the difference between acknowledging that which relies upon a participative contribution as opposed to that which acts independently or not?
<urgyen> missing the point in life is a high risk
<urgyen> so therefore an attempt to quantify quality..
<urgyen> establish categorical resonance
<BJKlein> ok.. back
<MichaelA> me too
<BJKlein> urgyen, welcome.. and yes... existential, in terms of a risk that is on a large scale and out of individual control
<Eliezer> existential risk: see http://www.nickbostr...tial/risks.html
<MichaelA> Not necessarily out of individual control
<Guest> talk to you tomorow love dad
<urgyen> but doesn't necessarily have to be out of an individual's control
<BJKlein> well, true.. but in general
<BJKlein> on a more global scale...
<BJKlein> one person can push the button
<BJKlein> but that nuke will kill millions
<MichaelA> Oftentimes mentioned with existential risk is the fact that technology is allowing us to do more and more but humans are essentially staying the same
<Eliezer> that doesn't define an existential risk, BJKlein - it has to be something capable of actually wiping out the human species
<urgyen> hmn
<BJKlein> as Bostrom puts it... yes.. but one nuke can lead to a total loss
<Eliezer> I think Bostrom was right to focus on proximal causes of existential risks
<Eliezer> the most likely underlying causes of existential risks are a separate issue
<urgyen> the fate of cognitive lag.. where beliefs have yet to catch up with current understanding..
<urgyen> so an element of unnecessary annihilation is required
<MichaelA> Not required
<urgyen> to give it that 'angst'
<MichaelA> 'An undesirable side effect'
<urgyen> no? just blatant stupidity is enough
<urgyen> but.. one would not be undermining something at a level of condition itself
<urgyen> not proposing the deconstruction of cause and effect
<MichaelA> The analysis of cause and effect
<MichaelA> Of existential risks
<MichaelA> When you go down to underlying human causes
<MichaelA> Gets tainted with political thinking, in almost all cases
<MichaelA> It's harder to pin down where the risks come from
<MichaelA> The dangerous aspects are possibly pan-human
<BJKlein> Political thinking is based on the human will for power..
<nrv8> but but
<nrv8> simpsons
<nrv8> :)
<BJKlein> EvPsy revisited
<MichaelA> Partially
<MichaelA> It's based on adaptations
<urgyen> well, I was drifting along basis of extending condition.. identifying in such a way that would be immune to political definitions that are harmful.. is that where this discussion is going as well?
<MichaelA> 'Extending condition'?
<BJKlein> ever read James L. Halperin's "The Truth Machine"?....
*** Disconnected
*** Attempting to rejoin...
*** Rejoined channel #immortal
<MichaelA> Conditions required to prevent existential disaster..?
<urgyen> the condition that allows a 'point of life'
<BJKlein> urgyen... point of life?
<urgyen> would tend to also prevent existential disaster, yes
<MichaelA> It might help
<urgyen> point to life?
<BJKlein> are you talking about a reason to live?
<MichaelA> A meaning of life?
<urgyen> sure.. but not in any 'strict' sense
<MichaelA> You think the lack of a meaning of life contributes to existential risk?
<urgyen> like overly rigid definition requiring life to be biologic, etc.
<urgyen> I demand that, Michael
<MichaelA> I think that partially, barely contributes to it maybe
<urgyen> :-)
<MichaelA> What about a bunch of people that really like life but they accidentally cause an existential disaster?
<MichaelA> That's a case where you have people with a meaning of life but death happens nevertheless
<urgyen> I am trying to 'extend' the 'condition' to not allow accidental discontinuity
<urgyen> a non death scenario
<urgyen> like, perhaps, after you 'die,' 'condition' is still present
<BJKlein> are you talking about cryonics?
<MichaelA> The condition of wanting to live?
<MichaelA> Whoa whoa, wait one second, let's clarify what you're saying
<urgyen> :-)
<urgyen> no
<MichaelA> What is the initial condition, and how do you propose to extend that?
<urgyen> basis is currently not defined
<MichaelA> I think I may be starting to understand you a bit
<urgyen> but that's ok
<BJKlein> or transhumanism..
<urgyen> it would derail the conversation regarding existential risk
<MichaelA> The basis can stay undefined as long as I have some clue what it is
<MichaelA> We're already sort of derailed from existential risk, but that's ok
<BJKlein> not a problem... we'll get back after we catch what you're talking about
<urgyen> it is inferred in 'existential,' correct?
<MichaelA> Nope
<urgyen> no?
<MichaelA> You've totally got the meaning wrong
<urgyen> :-)
<BJKlein> lol
<MichaelA> An "existential risk" is a disaster that threatens the life of the human race
<MichaelA> It has absolutely nothing to do with existentialism
<urgyen> requires this thing called 'human' then.. but that almost sounds like the same thing I am talking about
<MichaelA> http://www.nickbostr...tial/risks.html
<BJKlein> a definition of existential may help...
<urgyen> I did have that page up
<urgyen> you want to only talk of a narrow band of this existential risk thing, then..
<urgyen> I'll slow down again and watch
<MichaelA> We don't have to
<MichaelA> No, we can continue with what you were saying
<urgyen> 'k
<urgyen> basis.. is that which supports, and then it gets vague
<urgyen> like, life, intelligence, consciousness, cognition.. what?
<urgyen> you said human
<urgyen> that's cool but it also has a spectrum
<MichaelA> That which what supports?
<urgyen> this spurious activity that allows us to exist
<BJKlein> so urgyen.. you'd like to widen the scope of "life" we're trying to save?
<urgyen> sure
<MichaelA> Urgyen, in your mind, what's your definition of "existential risk" right now?
<urgyen> if you can establish 'identity' isn't the same no matter where you find it?
<MichaelA> I am envisioning millions of nuclear explosions or runaway nanotech
<urgyen> s/isn't/isn't it/
<BJKlein> urgyen: is english your second language?
<urgyen> I had to stretch to arrive at a sense of existential risk.. it was fun
<MichaelA> Urgyen, I'm not sure what identity exactly has to do with planetwide destruction, which is what existential risk is
<urgyen> not
<BJKlein> just curious...k
<urgyen> loss may be greater than the sum of benefit from having a humanity
<BJKlein> here's an idea....
<BJKlein> a little related to our topic
<BJKlein> James Halperin's "The Truth Machine"
<BJKlein> a world in which no one can lie... anyone read that one?
<urgyen> I don't think so
<BJKlein> He also wrote "The First Immortal"
<urgyen> but, ok, when a truth, even if a lie, can become a truth then
<BJKlein> good good book...
<MichaelA> No
<MichaelA> What happened in it?
<BJKlein> a device is made that one can wear as a ring..
<BJKlein> and it will tell you if someone is lying
<BJKlein> would this not turn the current world upside down?
<BJKlein> imagine politicians, criminals, etc.
<urgyen> trust is important
<urgyen> keeping our promises allows us to communicate
<BJKlein> but.. why rely on trust when you can have the truth!
<urgyen> :-)
<urgyen> truth is an approximation
<MichaelA> That wouldn't jibe well with the current human nature
<BJKlein> sure.. but the current approximation is skewed way off
<MichaelA> We can't decide not to lie just because there's a machine out there
<MichaelA> It might help though
<BJKlein> the ring would move it back to center a tad.. lol
<MichaelA> 'To center'?
<MichaelA> Like 'closer to our idealized world'
<MichaelA> ?
<BJKlein> yes
<BJKlein> whatever idealized is
<BJKlein> it might not be that great
<BJKlein> kinda like a stock market with perfect knowledge...
<BJKlein> where no one could make any money
<urgyen> I guess I was proposing that the 'end' could not happen for those 'not missing the meaning of life'
<BJKlein> because everyone would have perfect information
<urgyen> and missing or not, is related to the detectability of truth
<BJKlein> what is the "meaning of life"...
<BJKlein> as far as im aware.. there is none
<MichaelA> Urgyen, why not? What if you got nuked? There's the end, and regardless of whether you have a meaning of life or not, you're dead.
<urgyen> maybe
<urgyen> that's an assumption
<urgyen> what if the condition that brings this 'life' can be extended along a different but nearly identical basis?
<MichaelA> Hm
<urgyen> and what if I said this isn't -theory- ?
<MichaelA> What is it then
* urgyen hides his truth ring
<BJKlein> lol
<urgyen> well,,,
<urgyen> from the schools of thought that I study, this is the point
<MichaelA> What schools of thought do you study, if I might ask
<urgyen> this escaping the boundaries of limited existence
<urgyen> nyingma, dzogchen, vajrayana, prasangika madhyamika
<urgyen> they have something they call 'rainbow body'
<urgyen> and the descriptions are similar to how I am attempting to present here
<urgyen> the basis is called 'recognition'
<BJKlein> susan is here!
<urgyen> :-)
<MichaelA> How would those schools of thought deal with a biochemical weapon released to kill everything on the surface of a planet?
<MichaelA> Hi Susan!
<urgyen> oh, something like "Oh Shit!"
<urgyen> ;-)
<BJKlein> hola everyone...
<urgyen> hola
<urgyen> we would have to be suffering from a lack of recognition for that to be possible
<MichaelA> 'we'?
<urgyen> so you are proposing that this recognition can never reach something of a 'near perfection'
<MichaelA> recognition of what?
<urgyen> we as humanity
<MichaelA> Respect for life?
<MichaelA> Lack of recognition of what?
<urgyen> recognition of the nature of the mind
<MichaelA> Ahhh
<MichaelA> Now we're getting somewhere
<urgyen> they don't have the standard body mind separation tho
<MichaelA> Yes, recognition is important
<urgyen> that's why I kept introducing 'condition'
<urgyen> cause and effect is still a prerequisite of recognition
<MichaelA> Who doesn't have the standard body mind separation?
<urgyen> but those triggers are not limited or restricted to biological life form
<urgyen> well, mind, people get the idea of a ghost in the machine thing
<MichaelA> Ah
<urgyen> there is basically only machine
<urgyen> but it's not necessarily limited
<MichaelA> I seee
<MichaelA> You're correct
<MichaelA> What do you think these "triggers" are?
<MichaelA> Is it more accurate to think of them as tendencies, or neurological patterns?
<urgyen> vocabulary is fun
<BJKlein> thanks for joining the forums urgyen...
<urgyen> thanx
<BJKlein> hope to see more of you
* BJKlein steps out for a few
* MichaelA will be back in a while also
<urgyen> I think things go into a bit of detecting errors of logical typing...
<urgyen> at this point
<urgyen> where ppl accidentally take metaphor for truth
<urgyen> or.. it is truth.. but not literal
<urgyen> and that takes a bit of practice to flex into
<winnipeg> Topic for tonight is?
<Utnapishtim> hey
<Utnapishtim> DAMN ITS QUIET TODAY...
<Utnapishtim> chat didn't last long?
<urgyen> making dinner, bbiab
<urgyen> two main principles took a break
<urgyen> so to recap. topic was introduced as Existential Risk, from: http://www.nickbostr...tial/risks.html
<urgyen> where I attempted to re-establish intention as a necessary quantification



