  LongeCity
              Advocacy & Research for Unlimited Lifespans


Chat For Dec 8th 2002



#1 Bruce Klein

  • Guardian Founder
  • 8,794 posts
  • 242 ₮
  • Location:United States

Posted 09 December 2002 - 08:48 AM


<BJKlein> and a hush falls over the room
<Gordon> just 30 seconds
<Discarn8> <-- kicks off the Jeopardy theme
<BJKlein> I'll post a few lines and then open her up...
<BJKlein> Hello.. Welcome Guests and Returning members.
<BJKlein> Official chat starts now;
<BJKlein> Topic: Rationality.
<BJKlein> So why discuss this topic? Well, I haven't a clue
<BJKlein> or maybe I do, but I'm finding it hard to put together
<BJKlein> a coherent or dare I say rational reason for discussing this topic...
<BJKlein> Ahh, but there is always Google. She is the friendly Labrador I've always wanted.. "Fetch Google.. Good Dog! What did you find?"
<BJKlein> Woof!
<MichaelA> heh
<BJKlein> From: Center for the Study of Rationality
<BJKlein> http://www.ratio.huji.ac.il/
<BJKlein> "Rationality here means that decisions are made in a consistent manner to maximize benefit (utility). Interactive decision theory (also known as game theory) studies what happens when rational agents with different goals interact, each making its own decisions on the basis of what is best for itself,
<BJKlein> while taking into account that the others are doing the same. These ideas underlie much of economic theory, and have also had an important influence in such diverse areas as evolutionary biology, psychology, political science, computer science, law, statistics, philosophy, and the foundations of mathematics."
<BJKlein> k
<BJKlein> Where do we go from here?
<MichaelA> Interactive decision theory is what the prisoners in the Prisoner's Dilemma are supposed to do
<BJKlein> It's open season for discussion...
<Discarn8> Michael - but how many really do?
<MichaelA> Gordon, can you go over your distinction between rationality and Rationality again?
<BJKlein> Why is rationality important... etc..
<Gordon> to be a little more precise, rationality means using the BPT
<Gordon> Bayes's Probability Theorem
<MichaelA> Well, the Prisoner's Dilemma has never really happened in real life to my knowledge, at least not recently
<Gordon> Michael: I'll get to it eventually
<MichaelA> Ok
<MichaelA> Let's see, does making the "rational" decision have more to do with your method of reasoning or situational knowledge?
<Gordon> the BPT *always* picks the best choice
<Yudkowsky> incorrect, Gordon
<Gordon> it maximizes gain and minimizes loss
<MichaelA> It depends on what a priori information you have though
<Yudkowsky> Bayes' Theorem doesn't do that either
<Gordon> well, I'm right on the second count
<Gordon> damn
<Yudkowsky> Bayes' Theorem tells you the truth
<Yudkowsky> it doesn't directly specify decisions
<Yudkowsky> and it doesn't always tell you the correct belief
<Yudkowsky> only more often than any other alternative
<MichaelA> Yes, that would make things very easy if it were true though, heh
<Yudkowsky> indeed
<Discarn8> Gordon - how well does it differentiate between what you're willing to trade versus what you feel (key word!) you need?
<MichaelA> I think it's important to look at "rationality" as a human convention, too
<MichaelA> It's not always pure and glittery BPT, in the reality of society it has different, skewed meanings
<Yudkowsky> that is a difference between rationality and our concept of it
<MichaelA> True
<Yudkowsky> like the difference between phlogiston and chemical fire
* Gordon is afk
<MichaelA> There is an idealized rationality but also a cluster of connotations surrounding the word "rationality" itself
<Discarn8> Also stereotypes.
<MichaelA> Oh, so many
<Discarn8> Eliezer - would it be fair to say, Bayes theorem tells you the odds, rather than the truth?
<Yudkowsky> with two codicils
<Discarn8> *nod*?
<Yudkowsky> (forgive my spelling, btw, I'm not typing under optimal conditions)
<Yudkowsky> anyway, the first codicil is that any form of thought always only tells you the odds
<Yudkowsky> and the second codicil is that Bayes Theorem works better than anything else
<Discarn8> Any form of predictive thought, OK - but analysis of the past?
<Yudkowsky> any form of thought can only generate beliefs
<Yudkowsky> the correspondence between beliefs and reality is what we call truth
<Yudkowsky> Bayes' Theorem is a process that produces that correspondence
<Gordon> more reliably than other methods
<MichaelA> Well, so does "guessing"
<MichaelA> Yes
<Yudkowsky> or to be even more precise, Bayes' Theorem is a mathematical structure that is common to every kind of cognitive process that processes that correspondence at better-than-chance probability
<Yudkowsky> *produces
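For reference, the theorem the participants abbreviate as BT (and, early on, as "BPT") is the standard identity from probability theory, where H is a hypothesis or belief and E an observation:

```latex
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)}
          \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
```

The posterior P(H | E) is the degree of belief in H after seeing E; nothing in the theorem picks actions or guarantees truth, which is the correction Yudkowsky makes above.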
<MichaelA> Let's see, so human beings are poor approximations to true Bayesian thinkers, is part of meta-rationality trying to isolate those impulses in our thinking which are unBayesian and practice correcting those?
<Discarn8> I've hit my head on this one before, and it may just be my being dense, but knowing - having seen - the rock fall down, in and of itself, is a 'fact', correct? How does BPT fit in that?
<Yudkowsky> when you see the rock fall, you are looking at photons
<Yudkowsky> the rock reflects photons
<Discarn8> Aaaahh.. Gotcha.
<Yudkowsky> the photons strike your eyes
<Discarn8> Like I said. *wry grin* Dense.
<Yudkowsky> retinal transducers fire, signals go down the optic cable, in your brain a picture of the rock is reconstructed in your visual cortex
<Yudkowsky> because your brain and the rock are a part of the same universe, a complete physical process can operate whereby one reflects the other
<Gordon> while Eli gives us this lovely example, for those asking earlier, Rationality (with a capital R) refers to the human process that resembles the BPT (rationality, lower case)
<MichaelA> oh, so, nonideal rationality essentially, the human kind of rationality
<MichaelA> but "rationality" is the Platonic construct of a perfect-ish sort of thinking?
<Gordon> it's not a Platonic construct
<Gordon> it's a mathematical process that can be carried out (say, in an AI mind)
<Gordon> it's just that humans cannot, in their own thinking, execute the BPT exactly in the way an AI could
<Discarn8> In other words, this is how a machine intellect doublechecks its beliefs against reality.
* Gordon beeps yes
<Yudkowsky> heh, I've never met a machine intellect
<Yudkowsky> so while I *very highly suspect* they would use Bayes' Theorem I cannot be sure
<Discarn8> *lol* I sit corrected...
<Yudkowsky> but *we* definitely use Bayes' Theorem
<Yudkowsky> I can tell you that
<taza0> I don't understand how it can be exclusive. Can't you consider internal observations as having a direct effect on the external world?
<Gordon> you mean you see a rock fall in your mind and then it does in real life?
<taza0> no, I mean from thought to action
<taza0> what you're saying seems to lack fluidity
<Discarn8> Do you mean planning or precognition, taza?
<Yudkowsky> if you can successfully predict the state of the universe given each of several possible actions, then the state of the universe can, through you, become altered to the possible state that most closely matches your goals
<MichaelA> I call "rationality" a Platonic construct because even an AI wouldn't necessarily be able to *perfectly* execute the process that is the BPT, (or possibly better version), due to quantum randomness and other minuscule perturbations, so it still remains an ideal that can't be touched precisely, if that's your definition of it
<Gordon> so you're asking "What about ideas created in the brain that don't represent some existing part of the universe?"
<Gordon> Ani: true, but so long as the system is robust enough, this will happen rarely enough as to have no effect
<Yudkowsky> Gordon, that is not the case for rationality - the case for rationality is not that it is perfect, or even approximately perfect, because it is nowhere written that perfection must be possible
<MichaelA> Which could quickly corrode the meaning wrapped up in the word anyway, because if all sentient beings have perfect rationality, then there might not be as much of a point in making the distinction
<Yudkowsky> but you can show that the structure inherent in Bayes' Theorem must always produce the best correlation to reality, and that anything which produces correlation to reality must have that structure inherent in it
<Gordon> if I've got it wrong, please correct me, but I think this is how it goes
<Gordon> all nonaccidental processes are using the BPT
<Yudkowsky> all processes which produce correlations between different parts of reality can be described by BT and are fodder for BT
<Yudkowsky> truth is a correlation between belief and reality
<MichaelA> "fodder for BT" = "correctable by BT"?
<Discarn8> So, agreement on a belief is fodder for BT? Even if it's not a "true" belief?
<Gordon> yes
<Yudkowsky> anything and everything is fodder for BT if it correlates to the thing you want to know about
<MichaelA> you could easily spend forever gathering BT input for one decision
<Yudkowsky> for example, widespread agreement on a belief, combined with the additional distinguishing characteristics of dramatic appeal and untestability, is a good sign that the belief is false
<Gordon> Michael: that was the plot of a story; a link was posted around here not too long ago
<Discarn8> Yet two such false beliefs can be proven to be similar by BPT?
<Yudkowsky> suppose you want to know whether your shoes are tied
<Yudkowsky> you can look at your shoes
<Yudkowsky> or you can say, "I want my shoes to be tied, so that's what I believe"
<MichaelA> is there a BPT-ish way of assessing how much time you should rationally spend for a given BPT prediction of reality?
<Yudkowsky> why is one "more rational" than the other
<Yudkowsky> because one produces a better correlation between the belief and its referent
<Gordon> Ani: of course
<Yudkowsky> whether you want your shoes to be tied does not necessarily correlate all that strongly to whether they really are tied
<Discarn8> Gordon: And does it take infinite time to set up fully as well?
<CyKeyMan> Not to tie my shoes, it doesn't.
<Yudkowsky> but if you look at your shoe, you can expect that you will end up believing they are tied if they are tied, and believing they are not tied if they are not tied, almost all of the time
<Gordon> you don't need forever to get the truth
<Discarn8> EY: Gotcha - I think. *brain strain* Um. So, BPT tells you the rationality - the correlation between belief and reality - of a given statement.
<MichaelA> oh, it depends on what your standards are, Gordon ;)
<Gordon> by BPT standards, you don't need forever
<Yudkowsky> in other words, your beliefs end up being evidence about reality - in the Bayesian sense of being a good, highly correlated indicator - if you look at your shoes, but not if you believe what you want to believe
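A minimal numeric sketch of the shoe example; the figures are invented for illustration, not taken from the chat:

```python
# Compare how well two belief-forming policies correlate with reality.
p_tied = 0.5                        # prior probability the shoes are tied

# Policy 1: look at the shoes (a reliable, if imperfect, observation)
p_believe_if_tied_look   = 0.99     # looking at tied shoes, you almost always believe "tied"
p_believe_if_untied_look = 0.01     # looking at untied shoes, you rarely believe "tied"

# Policy 2: believe whatever you want (belief independent of reality)
p_believe_if_tied_wish   = 0.90
p_believe_if_untied_wish = 0.90     # same either way, so the belief carries no evidence

def posterior(p_b_tied, p_b_untied, prior=p_tied):
    """Bayes' Theorem: P(shoes tied | you believe they are tied)."""
    return (p_b_tied * prior) / (p_b_tied * prior + p_b_untied * (1 - prior))

print(posterior(p_believe_if_tied_look, p_believe_if_untied_look))   # ~0.99
print(posterior(p_believe_if_tied_wish, p_believe_if_untied_wish))   # 0.5, the prior unchanged
```

Under the first policy the belief is highly correlated with its referent and so counts as evidence; under the second the posterior never moves off the prior.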
<Yudkowsky> sorry about not being able to look at the keyboard as I type this
<MichaelA> to get "sufficiently good truth, good enough to tell to other humans", or "perfect truth"?
<Gordon> hmm, you must be in a strange location, Eli
<Discarn8> EY - don't sweat it - you're as coherent as the next genius. *impish grin*
<CyKeyMan> Why do you want to look at a keyboard?
<Gordon> Ani: re Eli's comments on perfection
<CyKeyMan> To see if its shoes are tied?
<CyKeyMan> Keyboards don't have shoes?
<CyKeyMan> Though they do have little non-skid feet.
<Gordon> truth is a relationship between beliefs and reality
<Gordon> we do not know whether it is possible to have a perfect relationship between them
<Gordon> (I think; I'm only in the 1st Grade of Rationality)
<Yudkowsky> but of all the ways of thinking we know of, the one that was named "rationality" seems to work better than any other at producing beliefs that correspond to reality
<Yudkowsky> this is an interesting fact, is it not?
<Yudkowsky> one worth investigating
<Gordon> I think that the only hard thing to swallow is that the BPT is true by its own standards
<Gordon> not true, sorry
<Yudkowsky> why is it that rocket scientists can step on the moon, but tribal storytellers cannot?
<Yudkowsky> they are both thinking using human brains
<Gordon> that should be not 'true'
<taza0> yudkowsky, can you give an example w/ multiple observers?
<Gordon> I meant that the BPT is proven to be our best known method of truth finding by the standards of the BPT
<Discarn8> Tribal storytellers can - what are politicians, after all, and they've at least been in orbit, now...
<taza0> a reconciliation between degrees of truth as gained from different perspectives, I mean
<Discarn8> They fly on the wings of the rocket scientist's mind.
<ChrisRovner> "Western science is based on two great achievements: the invention of formal logic by the Greek geometers, and the Galilean idea that causes could be discovered by experiments" - Einstein
<Gordon> Discarn8: the point is that the storytellers can say that they've been to the moon but haven't; rocket scientists can say that they've been to the moon and they have been
<Gordon> so, let me try again to see if I've internalized this (I seem to get a little closer every time)
<Discarn8> Gordon - Understood, but the counterpoint - politicians in space - was just too tempting for me to pass up. IOW, the storytellers' tales correlate poorly with reality, while the rocket scientists' tales correlate well.
<Gordon> the BPT is a process by which all processes that we know of establish relationships between parts of reality
<Yudkowsky> BT is a structure common to all processes that produce correlations between belief and reality
<Gordon> okay, then what's the difference between BPT and BT?
<taza0> but the storyteller can be the precursor for spaceflight?
<Yudkowsky> Bayesian Probability Theorem is the wrong term, and Bayes' Theorem is the correct term, as Google shows
<Gordon> oh
<Gordon> well, I guess that explains why you were the only one calling it a probability theorem :-)
<Gordon> now, how is it that the BT is involved in evolution? in nonaccidental processes?
<Discarn8> taza - indubitably. Gives the spark which drives the scientist.
<Gordon> and, is this use of the BT different from what we call rationality?
<Yudkowsky> but the science fiction writers who really sent us to the moon didn't actually claim that people had been to the moon, but rather knew that the stories they told were stories
<Yudkowsky> had they claimed people had been to the moon, then any attempt to actually reach the moon would have been treated as blasphemy when it threatened to disconfirm the stories
<Discarn8> Eliezer - but who gave the prestige to the story-teller profession which allows the SF writers to make a living doing it?
<Yudkowsky> part of imagination being able to alter reality is knowing that the imagination has not yet happened
<Yudkowsky> it does not require believing things that are not true
<Yudkowsky> discarn, science fiction is far less prestigious than the priesthood even today
<Discarn8> Belief of misapprehended 'truths' often seems to be the biggest stumbling block in the apprehension of new truths.
<Yudkowsky> it is not prestige but simple enjoyment of the readers that allows SF writers to make a living
<Yudkowsky> unlike other, lesser professions, such as modern artist or bioethicist
<Discarn8> No argument. However, SOME of the bright lights manage to survive - and thrive! - via their association w/ the clergy's role.
<Discarn8> *chuckle*
* Gordon asks that when this bit about SF is through he'd like his questions addressed, please
<Discarn8> <-- bows aside... A better question than mine.
<taza0> I am bothered by modern art because it asks me to render meaning; it does not follow communication
<Gordon> taza0: good modern art will still convey to you a clear meaning, although, like in good literature, you may not be aware that it's being conveyed
<Gordon> hmm, either Eli is afk or it's not an easy answer
* caliban glances at the discussion and looks forward to reading the logs tomorrow
<Discarn8> Most anything done by man, IMO, is communication. But this strays a goodly bit from 'rationality'
<Gordon> well, in the mean time, let's continue
*caliban* take care
<BJKlein> see ya caliban
<MichaelA> "communication" is another one of those words with incredibly fuzzy boundaries
<Discarn8> *nod* Indeed. As it should be, IMO.
<Discarn8> Words are those lovely things we all agree to disagree on. And the more important the word to the participants, the uglier the disagreement.
<MichaelA> Well, it would be ok to break it up into 5-10 additional words, but that would be useless because the words wouldn't instantaneously become social norms
<Discarn8> *nod* And once you describe them in terms of communication, the great majority will say, "Well, why didn't you say so?" and start fuzzing them, as well.
<MichaelA> And there's a limit to how many words a language can have and still be useful, too
<Gordon> that's a human limit
<Gordon> not a theoretical one
<MichaelA> Yep
<Discarn8> I'd have to agree w/ Gordon on that one.
<MichaelA> Obviously
<Discarn8> And - a language can have any number of subsets.
<Discarn8> That is - technical jargon used in one venue is nonsense in another.
<MichaelA> True
<Discarn8> Hence, there are ways for languages to sidestep human limitation
<MichaelA> Making up new jargon is one way
<Discarn8> Better - making up new venues.
<MichaelA> And since we think internally using words also, it's like adding new engines to your mental software in a very superficial way
<MichaelA> Depends on how useful the venue is or if it's needed
<Discarn8> Valid point.
<Discarn8> The trick is, the venue allows you to agglomerate the concepts as a single body of knowledge.
<MichaelA> So you can sell it off as a package?
<Gordon> knowledge makes the words; words do not make the knowledge
<Discarn8> The word "gene" in context of DNA is different than the word "Gene" in your family tree
<Discarn8> Not so much sell it off, but self-reference and distinguish between conjunctions of meanings at words.
<Discarn8> Words describe the knowledge.
<MichaelA> And sell it off, because new words have limited use if not tuned for memetic propagation
<Discarn8> *quiet grin* Shan't argue that one.
<Discarn8> But, preferably, sell off (in your terms) the venue. One sale, instead of many.
<MichaelA> Sale to whom? oO
<Discarn8> There's the fun part - finding like minds who'd find such a differentiation of terms of use in communicating.
<MichaelA> Very very true
<MichaelA> One thing that immediately drew me to transhumanism was the wealth of new words :D
<Discarn8> There's always the "fun" of memetic assault, as well. Orwellian counter-memeing, etc.
<Discarn8> One thing which turns me off about >H is the excessive number of words. *wry grin*
<MichaelA> Hehe
<MichaelA> Most of them are naturally not used, though, right?
<Discarn8> Oh, sure. Right. Definitely.
<taza0> >H?
<Discarn8> (one example of countermemeing in example)
<MichaelA> I would say that there are only about 20 truly non-mainstream words I use on a regular basis
<Discarn8> >H = Transhumanism
<MichaelA> I'm going to brb in 5-10ish
<Discarn8> Depends on the situation. In many contexts, I try HARD to keep the non-mainstream out of my mouth and off my keyboard.
<Discarn8> *nod*
<Gordon> I suspect that you like learning new words because it makes you part of a tribe
<taza0> a larger word source generally allows for brevity by decreasing context
<Gordon> you feel like part of a community
<MichaelA> Gordon: I don't know, I'll bet there's a small part of that, yes
<Discarn8> Gordon - that's more the venue than the word list, IMO
<MichaelA> But there's the intellectual appeal of opening up new branches in memespace that weren't there before, words are like the nails you nail into a rock face when you climb it
<Discarn8> taza - until you hit someone outside the common venue in use. *wry grin* Then it gets much wordier
<Discarn8> Michael - IMO, true - but only if there are others following the same set of nails.
<Discarn8> And it also spoils that climb for anyone who wants to set their own nails.
<MichaelA> It can partially
<MichaelA> These judgement calls are extremely fine-grained ones
<Discarn8> *NOD*!!!
<Discarn8> And often bereft of context, until there are others attempting a subtly different 'assault' on the same pitch.
<Yudkowsky> words carve up reality
<Yudkowsky> if a tree falls in the forest, does it make a sound?
<Yudkowsky> here the word "sound" fails
<Yudkowsky> you must use "acoustic vibrations" and "auditory experiences"
<Yudkowsky> sound conflates the two
<Discarn8> Interesting... never thought of that one in that light!
<Yudkowsky> it carves up reality too grossly for such fine work
<EmilG> Under some philosophies, it doesn't even make an acoustic vibration.
<Yudkowsky> then place recording instruments in the wood
<EmilG> Those philosophies are utterly useless to us, though. ;)
<Discarn8> Except for mental exercise, perhaps. *wry grin*
<Yudkowsky> all men are mortal, Socrates is a man, therefore Socrates is mortal
<Yudkowsky> the classic Aristotelian syllogism
<Yudkowsky> supposedly an absolutely reliable means of producing absolutely reliable truths
<Yudkowsky> since, after all, men *by definition* are mortal
<Discarn8> If you and your reader/listener share an absolutely reliable set of definitions.
<Yudkowsky> but suppose that Socrates drinks hemlock, and doesn't keel over
<Yudkowsky> what does Aristotle tell you then?
<Yudkowsky> all men are mortal, Socrates is not mortal, therefore Socrates is not a man
<MichaelA> Socrates has good immunity to hemlock
<Yudkowsky> well... that's helpful
<Yudkowsky> since men are mortal, by definition, we cannot know that Socrates is a man until after we observe him to be mortal
<Yudkowsky> that most two-legged, language-using, ten-fingered, clothes-wearing things are also mortal is an empirical truth, not a logical one
<Yudkowsky> an Aristotelian definition can never produce more than you put into it
<Gordon> I think that's something that's taught but not understood
<Yudkowsky> words are not facts, but hypotheses about the organization of facts
<Yudkowsky> "If a tree falls in the forest and no one hears it, does it make a sound?"
<Yudkowsky> this produces controversy because people believe that "sound" is a substance, a real thing that must be either present or absent from the event of a falling tree
<Discarn8> How 'bout, "Words are symbols representing a given set or sets of facts"
<Yudkowsky> a common misconception
<Yudkowsky> if that were all words were, they would be of little use in rationality
<Discarn8> OK... Where's it wrong?
<Yudkowsky> the word "man" reflects the association of two legs and mortality
<Yudkowsky> and many other characteristics as well
<Yudkowsky> an empirical correlation that lets us use a few distinguishing characteristics to harvest many other incidental characteristics
<Yudkowsky> we can guess that Socrates is mortal, in advance of observation, because the word "man" bundles together many characteristics including mortality
<Yudkowsky> seeing some of these characteristics, such as a certain shape and appearance, we guess at Socrates's mortality
<Yudkowsky> let's say we have a new word, "wiggin", defined as "a green-eyed man who commits crimes"
<Yudkowsky> a perfectly good Aristotelian definition
<Yudkowsky> as humans actually use words, though, and not as Aristotle thought we used words
<Yudkowsky> "wiggin" is itself a hypothesis
<Yudkowsky> either green eyes and crime correlate
<Yudkowsky> or people who do have green eyes and commit crimes share other incidental characteristics as well
<Yudkowsky> words are clusters of characteristics
<Gordon> qualia, even?
<Yudkowsky> little blobs drawn around points mapped onto thingspace according to their properties
<Yudkowsky> ...qualia?
<Yudkowsky> thus, the word "wiggin" is a false word
<Discarn8> with different dimensions in the various axes - in this case, positive on the green-eyed and criminal axes?
<Yudkowsky> as far as we know
<Yudkowsky> there's no cluster in the place where the word "wiggin" points
<Yudkowsky> right, Discarn
<Gordon> yes, qualia
<Gordon> might you call words clusters of qualia?
<Yudkowsky> um... no
<EmilG> No, qualia are an optional accessory.
<Discarn8> There IS a cluster, but the defining terms used in 'wiggin' do not pull a useful concept from these clusters.
<Yudkowsky> there is not a cluster because wiggins are not produced at any higher probability than chance would suggest
<Yudkowsky> nor do wiggins share any other characteristics
<Gordon> hmm, but I thought the whole point was that minds were believing in characteristics that were stored as qualia
<Discarn8> But there are green eyed criminals. So, by that definition, there ARE 'wiggin's. It's just that the term does not give a... um, symbolic referent of use.
<Discarn8> In other words, it's a pointer to a randomly determined sector of memory.
<Yudkowsky> there are wiggins but there is no cluster
<Yudkowsky> if you map things by eye color and crime, there is no unusual correlation
<Discarn8> So, the cluster is the 'usefulness' of the term, then?
<Yudkowsky> no, the term is useful because it points to a cluster; if there is no cluster, the word is false
<Discarn8> Ah! Much better term - the cluster is the degree of correlation of the term to reality.
<Yudkowsky> like "phlogiston" is false
<Gordon> cluster is more like a correlation?
<Yudkowsky> cluster is a greater-than-chance correlation of properties
<Gordon> aka a significant correlation
<Yudkowsky> no, "significance" is a scientific test
<Yudkowsky> in rational terms, the correct... word... to use, is greater-than-chance
<Gordon> well, `cluster' sounds like any loose connection
<Yudkowsky> if there is no nonchance correlation of the distinguishing characteristics, and no nonchance regularity in the incidental characteristics, the word is useless
<Discarn8> "<Yudkowsky> little blobs drawn around points mapped onto thingspace according to their properties"
<Yudkowsky> in fact, the word deserves the title "wrong" because it leads people to expect correlation and regularity where there is none
<Yudkowsky> there are green-eyed humans who commit crimes... but there are no wiggins
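One way to cash out the "no cluster" claim, with invented frequencies (none of these numbers come from the chat):

```python
# Does the candidate word "wiggin" (green-eyed person who commits crimes)
# point at a greater-than-chance cluster of characteristics?
p_green    = 0.02                          # hypothetical fraction of people with green eyes
p_criminal = 0.05                          # hypothetical fraction who commit crimes

expected_by_chance = p_green * p_criminal  # joint frequency if the traits are independent
observed_joint     = 0.001                 # hypothetical observed fraction of green-eyed criminals

lift = observed_joint / expected_by_chance
print(lift)   # ~1.0: no correlation beyond chance, so "wiggin" picks out no cluster
# A real cluster word like "man" has lift >> 1: clothes-wearing, language-using,
# ten-fingered things co-occur far more often than their separate frequencies predict.
```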
<Discarn8> By defining the term, you've created it. Its usefulness may or may not exist, but the term itself is now extant.
<Gordon> is this just the way human word-making works?
<Gordon> we only have use for describing clusters?
<Discarn8> Try, "By definig the term 'wiggin',..."
<Yudkowsky> Discarn, what you're referring to is a form of social conformity
<Yudkowsky> one that can actually operate to defy the truth
<Yudkowsky> to produce falsities in the human brain
<Yudkowsky> so, yes, once you've defined it, the term itself is now extant
<Yudkowsky> a new bit of floating irrationality
<Yudkowsky> just as once you speak a lie, it can float around
<Discarn8> *nod* OK... but is that social conformity real? It affects people, it affects thought processes...
<Yudkowsky> lies are real things, so are irrationalities
<Discarn8> Sorry. Leading question. Yep.
<Discarn8> The question I guess should be, is it rational.
<Yudkowsky> I'd say no
<Discarn8> Oooookay....
<Yudkowsky> time for me to sign off, probably for at least an hour
<Yudkowsky> sorry I couldn't participate more
<taza0> goodbye
<Gordon> bye
<Discarn8> Thanks for the participation... You got my head goin'! *smile*
<MichaelA> Discarn, here's this really weird paper I read yesterday:
<MichaelA> http://www.cs.utk.ed...eology-long.pdf
<Gordon> yeah, every time I'm in the same place as Eli and something is being discussed I realize just how much more I still have to learn
<MichaelA> regarding your "is social conformity real?" question
<Discarn8> *chuckle*
<Discarn8> Gordon - it's a wonderfully disconcerting feeling, no
<Discarn8> ??
<Discarn8> Michael - interesting cite - thanks for the pointer.... Lots of chewy bits, looks like
<Gordon> well, most of the time I think either Eli is talking over our heads and we need to pull him back down or what he's saying is dense enough that it can't currently be understood without a lot of thinking
<taza0> anyone care to apply the concept of social conformity to present thought?
<Gordon> in either case, he's several years ahead of me and I've got a lot of catching up to do
<Discarn8> Gordon - Famous quote from I forget whom. "No pack of poodles will ever understand Bertrand Russell's work, but a pack of poodles COULD stop Bertrand Russell."
<Discarn8> Only years, Gordon? You're doin' a WHOLE lot better than I am. *grin*
<Discarn8> taza - in what context?
<Gordon> I think I've got about 5 years worth of thinking to do to catch up to where Eli is right now
<taza0> I was referring to a general model based on our discussions; specifically, I'd like to hear someone talk about the propagation of lies or untruths
<taza0> and the results, which I would guess be accidental
<Discarn8> Well, lessee....
<taza0> or the operation of rationality within an irrational system
<Gordon> but I've been moving faster than him, mostly because he already broke the ground and I don't have to pursue totally new ideas
<Discarn8> The BT gives you a good indicator of how close any given belief (thought/concept/etc) is to reality.
<Discarn8> Hence, it can also indicate how far a belief may be from reality.
<Discarn8> However, there are additional aspects to take into account, in that the belief is not in and of itself a belief, but rather a tool used (perhaps) by one or more to gain control over others.
<Discarn8> Thus, it may behoove someone to spread a lie, due to the utility of control it gives them over the recipient, rather than the utility of clear communication.
<Discarn8> AKA, memetic subversion.
<Discarn8> Does that come close to what you're thinking, taza?
<taza0> I was rather vague. How do we view, say, the standing of Shakespeare contemporarily?
<Discarn8> *blink* Ah.
<Discarn8> <-- changes mental gears...
<Discarn8> Um, short answer - I dunno. *wry grin* Probably a concentration of the value he's had for us, individually, as compared to the difficulties associated with him.
<Discarn8> From that, it expands to take an overview of each of the people involved in the contemporary society, as well as their predecessors' influence on the current society.
<Discarn8> <-- drops a pin
<taza0> so
<Discarn8> Just saw your other quote - rationality in an irrational system.
<Discarn8> H'rmm...
<taza0> I'm not a scientist, but I'd believe most don't apply scientific logic to the rest of their lives.
<Discarn8> *lol*
<Discarn8> I disagree!!!
<Discarn8> It's just that the bulk of people don't see it that way, and are not careful to examine their data for preconceptions!
<taza0> heh
<Discarn8> If you ask someone why they do some particular act, they WILL have some reason. Almost always.
<Discarn8> If not, it'll usually be a trained response. "Why do you bite your nails?" - "Dunno" - when actually it's a way they've found to externalize and rid themselves of excessive levels of stress. (as one perhaps)
<Discarn8> Do the bulk of people follow the scientific method? Yes- but very very poorly.
<Discarn8> Do we use trial and error? You damn well betcha!
<Discarn8> Do we make hypotheses? Uhuh!
<Discarn8> Do we test our hypotheses? Over and over again.
<Discarn8> Do we synthesize our results? Yep.
<Discarn8> Even animals do this, to some degree.
<Discarn8> The great strength of the scientific method, to me, is that it allows us to dispense with much of the time wasting extras between making discoveries.
<Discarn8> Yes/No?
<taza0> My distinction was between conditioning by response, in the narrowest sense, and observation and hypothesis. Negative and positive responses are different, although both are involved together. This was relevant given the context of mysticism I was expressing above.
<Discarn8> OK...
<Discarn8> How does one condition a human for a given response?
<taza0> I tell you Shakespeare is good.
<Discarn8> OK. So you've given me a 'fact'
<Discarn8> Assuming I trust you, I take it as a basis for something I may have never experienced for myself.
<Discarn8> This being the case, it may well give me motivation to search out Shakespeare for my own experience.
<Discarn8> I read Shakespeare, with the color of the 'Shakespeare is good' "fact" filtering my perceptions of the book.
<Discarn8> If I were to mildly have disliked Shakespeare, your "fact" might well change that to neutrality.
<Discarn8> Or it might push it to acceptance.
<Discarn8> Each of these variables can be examined, and reversed - in several cases, through a whole spectrum of potential responses.
<Discarn8> Thus - your initial input affected the system, but in some situations it took a neutral reaction "Shakespeare? Whazzat?" and turned it into a negative reaction.
<Discarn8> The various odds are beyond the scope of this bit, tho. *grin*
<Discarn8> Let's say you've gone a step or two further.
<Discarn8> You've found out my preferences, you've found out when I'm most open to new suggestions, and you've found other positive referents.
<Discarn8> Let's say I like a good sex scene, I'm most open in the morning, and I love beer.
<Discarn8> So, to start conditioning me to like Shakespeare, you bring over a 6pack some morning and pop in a video of Romeo & Juliet, a version which goes into great graphic detail about their sexual escapades.
<Discarn8> That part done, you leave me a similar prurient copy of "The Taming of the Shrew".
<Discarn8> Etc.
<Discarn8> Conditioning positive, associate the goal with something positive from the victim's experience.
<Discarn8> Conditioning negative, associate the goal w/ something negative from the victim's experience.
<Discarn8> Basic salesmanship.
<Discarn8> Boy, I'm blabbering tonight. *grin* Make sense?
* EmilG unidles. Should I bother to read all of the above? :)
<Discarn8> Talking about conditioning vs logical response. Prolly not.
<Discarn8> The other stuff, w/ Eliezer, is good, tho'
<taza0> I was getting water
<taza0> I'm still not sure what prompted that, though. I was just arguing what I said initially
<Discarn8> lol
<Discarn8> Sorry... Was extrapolating from your initial "shakespeare is good"
<taza0> heh, I was just curious
<Discarn8> Having stepped in it, I shall beat a retreat...
<Discarn8> *grin*
<Discarn8> Have a good night, folk.
<Eliezer> Hiya
<Eliezer> I'm back
<Eliezer> with a real computer this time
<Gordon> what, were you on a Newton or something ;-)
<Gordon> I e-mailed my questions, but I'll try asking here again
<Gordon> what is the difference between BT as used in rationality and as used by all nonaccidental processes?
<Eliezer> there's only one BT last time I checked
<Gordon> well, okay, but there seemed to be some dissonance when I tried to refer to the BT as being used by all nonaccidental processes
<Gordon> or, so the BT is used in some way that is still unclear to me by all nonaccidental processes
<Gordon> and it is also used in truth finding in the same manner, but for different purposes
<Eliezer> do you know how to define "accident"?
<Gordon> not on purpose
<Gordon> it happens as a result of no goal
<Gordon> i.e. it was not caused by a goal
<Eliezer> what's a "goal"?
<Gordon> that, I'm not totally sure on
<Eliezer> okay... so, basically you overreached the limits of what you intuitively understand when you started talking about rationality being used by nonaccidental processes
<Eliezer> what do you understand?
<Gordon> well, I have an intuitive understanding of goal, but I don't think it's a sufficient understanding of goal
*Eliezer* I'm sorry. I did manage to get to a computer but it wasn't my own, and I didn't get to participate the way I wanted to.
*Eliezer* Sigh. IOU 1.
<Gordon> to me a goal is anything that's the cause of directed behavior
<Eliezer> what do you understand about rationality, using your own words and ideas?
<Gordon> well, I understand that rationality is related to the BT
<Gordon> and that it is a way of finding the truth
<Gordon> that works better than anything else I know of
<Eliezer> how do you know that?
<Gordon> in some sense by using the BT
<Eliezer> heh, everything uses the BT - I'm asking how you know, for example, that rationality is related to the BT
<Eliezer> everything not literally everything, just all human reasoning processes
<Gordon> I hypothesize that the BT is the best truth-finding mechanism I know, and from what I can see it is
<Gordon> oh, because rationality is related to truth finding
<Eliezer> why are you talking about Bayes' Theorem at all?
<Eliezer> what's so special about it?
<Gordon> as I understand it, the point of rationality is to be able to find the truth reliably (though I'm guessing that there's more to it)
<Gordon> it finds the truth more often than other methods
<Eliezer> why?
<Gordon> that's about where I get lost; AFAIK it's because experimental evidence says that it does
<Gordon> there's a correspondence
<Gordon> that's not deep enough, though, is it?
<Eliezer> this is what I mean when I say that you overran your intuitive understanding
<Eliezer> have you ground through Bayes' Theorem a couple of times? the arithmetic?
<Gordon> yes
<Eliezer> can you give a nonobvious example of Bayes' Theorem at work?
<Gordon> give me a minute
<Gordon> or two
<Gordon> sorry, the only ones I can think of involve probabilities
<Gordon> unless you meant something where it is not obvious that it has a probability? That's a lot tougher for me to know, since I'm not all that sure on what is thought of as obvious and as not having a probability
<Gordon> the most nonobvious example I can think of is in differential reproduction
<Eliezer> can you, from today's discussion, give an example of how words use BT?
<Gordon> okay, maybe, the probability that a cluster has a word given that the cluster occurs in the society that speaks the language the word is in
<Gordon> from differential reproduction, maybe the probability that an organism will reproduce given that its parent reproduced
<Eliezer> how about clustering itself?
<Gordon> the probability that clustering will occur given a set of characteristics exist in a greater-than-chance correlation
<Eliezer> not the probability that clustering *will* occur
<Gordon> does
<Eliezer> given that a cluster *does* occur
<Eliezer> and that a word points to the cluster
<Eliezer> how is the word a usage of Bayes' Theorem?
<Gordon> P (a word | a cluster); P ( a cluster | a greater-than-chance correlation)
<Eliezer> maybe you should go back and review what I said earlier about clustering
<Eliezer> look for something that might hide a usage of Bayes' Theorem
<Gordon> usefulness of a word = P (a cluster | greater-than-chance correlation of characteristics)
<Eliezer> at this point, you're just plugging random words into Bayes' Theorem
<Eliezer> take a *specific* word
<Eliezer> like... "man"
<Eliezer> give me an example of how using that *specific* word, constitutes a use of Bayes' Theorem
<Gordon> sorry, lost my connection
<Gordon> usefulness of wiggin = P ( green eyes and committing crimes occur together | green eyes and committing crimes have a greater-than-chance correlation)
<Eliezer> too abstract
<Eliezer> a specific human
<Gordon> no, wait, make that:
<Gordon> okay, usefulness of Gordon = P (beard and glasses occur together | beard and glasses have a greater-than-chance correlation)
<Gordon> or, wait, maybe I can do better
<Gordon> usefulness of Singularitarian = P (belief in Singularity and desire to create Singularity | belief in Singularity and desire to create Singularity have a greater-than-chance correlation)
<Eliezer> yep... random words
<Gordon> ?
<Eliezer> none of the things you just said, make any sense
<Gordon> yeah, I noticed that
<Gordon> so I've still got it all wrong
<Gordon> I started to get some idea, though, that the existence of a particular word is tied to the existence of the cluster it represents
<Gordon> okay, try this one:
<Eliezer> hint: when in doubt, be as specific as possible
<Eliezer> drop abstractions in favor of the things they're abstracted from
<Gordon> Socrates is mortal = P (Socrates is a man | man is mortal) (not a word, but maybe that's a correct use of the BT)
<Eliezer> the probability that Socrates is mortal, equals the probability that Socrates is a man given that man is mortal??!
<Gordon> no, that doesn't make sense either
<Gordon> geez, it's hard to apply it in these cases
<Gordon> okay, trying with a specific use of a word
<Eliezer> moreover, stop trying to toss around P( foo | bar)
<Eliezer> adding that to something doesn't make it Bayesian
<Eliezer> first, learn to describe things verbally
<Eliezer> otherwise you totally lose track
<Gordon> okay okay, the probability that I know the word pinecone is the probability that I have experienced the cluster of characteristics that make up a pinecone given that a cluster of characteristics make up something commonly referred to as a pinecone
<Eliezer> okay... one, that still didn't make much sense
<Eliezer> two, you are still talking *about* words
<Eliezer> not showing how to *use* words
<Eliezer> moreover, delete "words" and substitute "concepts"
<Eliezer> don't tell me about the probability that I know a word, or the probability that a cluster exists
<Eliezer> let us take as background that I know the word, and that it refers to a real cluster
<Eliezer> show me how using the word is an instance of Bayes' Theorem
* BJKlein puts out a request for next week's chat
<BJKlein> Plus I'd like to take this opportunity to thank all the participants for your valued input...
<BJKlein> Eliezer, GordonW, MichaelA.. and other .. Bravo!!
<BJKlein> however, I must admit, I fell asleep about an hr ago...
<GordonW> grr, there's something that I'm missing; all I can come up with are things that don't make any sense
<BJKlein> sleep/wake cycle is a little wacked
<Eliezer> Aristotle says: all men are mortal, Socrates is a man, therefore Socrates is mortal
<Eliezer> pardon me, that's not exactly right
<Eliezer> Aristotle says: all men are mortal by definition; Socrates is a man; therefore by definition Socrates is mortal
<Eliezer> can you spot the logical flaw?
<GordonW> if Socrates drinks hemlock and doesn't die, he might still be a man?
<Eliezer> the logical flaw is that the supposed "logical syllogism" might be wrong
<Eliezer> it is possible that Socrates will drink the hemlock and not die
<Eliezer> maybe he's... immortal
<Eliezer> so what's wrong with the syllogism?
<Eliezer> if we find that Socrates is immortal, what went wrong with the syllogism?
<GordonW> the definition of man
<Eliezer> how can a definition be "wrong"?
<GordonW> it does not correspond to reality
<GordonW> it doesn't match up to a cluster
<GordonW> rather, it doesn't have a cluster
<Eliezer> now you're repeating things I said, so screw *that*
<Eliezer> there's sure as hell a cluster where humans are
<GordonW> sorry, I'm not consciously repeating you
<Eliezer> but in any case, we're talking about Aristotle now
<Eliezer> Aristotelian definitions have nothing to do with empirical clusters
<Eliezer> they are simply collections of characteristics, each individually necessary, all together sufficient, for membership in the Aristotelian class
<Eliezer> how can an Aristotelian class be "wrong"?
<GordonW> the Aristotelian class is not a real cluster
<Eliezer> why should an Aristotelian class need to be a cluster?
<GordonW> because for it to be useful it needs to correspond to clusters that really occur
<GordonW> otherwise it doesn't describe reality
<Eliezer> but Aristotelian definitions are simply logical classes
<Eliezer> they make no mention of clustering
<Eliezer> nor do Aristotelian syllogisms make mention of clustering
<Eliezer> so I repeat: all men (by definition) are mortal; Socrates is a man; Socrates is (by definition) mortal; Socrates drinks hemlock and lives; what went wrong?
<GordonW> so in that case they are a logical construct, not a construct that maps onto reality, but that still doesn't get at the logic problem
<Eliezer> yes - what's wrong with the logical construct?
<Eliezer> how does the logic fail?
<GordonW> Socrates is not a man (by definition)
<Eliezer> continue
<Eliezer> keep going
<GordonW> or Socrates does not actually live, but we'll count that one out
<GordonW> okay, so Socrates is not a man (by definition), ergo either the definition is wrong or Socrates is something that shares characteristics of men but is not an example of the cluster man
<GordonW> well, nix that cluster stuff, because you just said that it was out
<Eliezer> yes
<Eliezer> *now* move up a level of abstraction: what's wrong with Aristotelian logic in general
<GordonW> so make it share characteristics with the definition of men but is not an example of a definition of man
<GordonW> it's wrong in that it does not reliably assign things to classes
<Eliezer> actually, it does reliably assign things to classes
<Eliezer> if something has some of the necessary characteristics, but not all of the necessary characteristics, it is not a member of the Aristotelian class
<Eliezer> perfectly reliable
<GordonW> well, the classes don't really mean anything related to reality, but I don't think that matters
<Eliezer> let's accept Aristotelian definitions as they stand
<Eliezer> collections of individually necessary, and together sufficient, characteristics
<Eliezer> Someone asks: "Is Socrates mortal?"
<Eliezer> (this being just before Socrates is about to drink the hemlock)
<GordonW> ah, in that case I would say yes
<GordonW> because he has, up to now, displayed all the characteristics of men
<Eliezer> Someone answers: "Yes. All men are (by definition) mortal, Socrates is a man, therefore Socrates is (by definition) mortal."
<Eliezer> and adds: "And that's not just a guess, it's a logical truth. It's correct *by definition*. We know absolutely that Socrates will die when he drinks the hemlock."
<Eliezer> Socrates drinks the hemlock and lives
<Eliezer> what went wrong?
<GordonW> Socrates was not a man, as we already established
<GordonW> up a level, though, I'm still not sure what's wrong
<Eliezer> so the person didn't really know whether Socrates was a member of the Aristotelian class, "man"?
<GordonW> correct
<Eliezer> but once you know Socrates is mortal, *then* you can finally conclude that Socrates is a "man"
<Eliezer> but you can't know whether Socrates is a "man" until you've separately observed all the necessary characteristics required for membership in the Aristotelian class
<GordonW> ah, but he's dead, and he's not a man anymore :-P (I know, not really related)
<Eliezer> here is a question: can reasoning on Aristotelian classes and Aristotelian syllogisms *ever* produce new information?
<GordonW> no, it's all set up a priori
<Eliezer> so Aristotelian logic is *absolutely* useless?
<GordonW> yes, because it only tells us what we already know
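A sketch of the point being made here; the class definition and the membership test are hypothetical, purely for illustration:

```python
# An Aristotelian class: characteristics each individually necessary,
# all together sufficient, for membership.
ARISTOTELIAN_MAN = {"two_legged", "language_using", "clothes_wearing", "mortal"}

def is_aristotelian_man(observed: set) -> bool:
    # Membership can only be asserted once every defining characteristic,
    # including mortality itself, has already been observed.
    return ARISTOTELIAN_MAN <= observed

socrates = {"two_legged", "language_using", "clothes_wearing"}   # mortality not yet observed
print(is_aristotelian_man(socrates))   # False -- really "not yet decidable":
# the syllogism can only hand back what was packed into the premises, so it can
# never tell you in advance whether Socrates will die when he drinks the hemlock.
```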
<Eliezer> okay, now suppose you're present when Socrates drinks hemlock
<Eliezer> I ask you to guess whether Socrates is mortal
<Eliezer> what do you guess?
<GordonW> yes
<Eliezer> because Socrates is a man, and men are mortal?
<Eliezer> note that we are *not* now using Aristotelian classes anymore
<Eliezer> just regular words
<GordonW> wait, did he already drink it or not?
<Eliezer> not yet
<Eliezer> he's about to
<Eliezer> you have to guess whether he'll die
<GordonW> then yes, because up until now he's displayed characteristics which lead me to believe he falls into the cluster `man'
<Eliezer> how can you use those characteristics you already know, to produce information about something you *don't* know?
<GordonW> BT
<Eliezer> ooh, good answer
<Eliezer> how, specifically?
<GordonW> socrates is mortal given that socrates is a man and has up until now displayed characteristics of men
<Eliezer> when you say "is", is that an absolutely reliable conclusion?
<GordonW> now, this is a probability
<Eliezer> show me how you would compute the probability that Socrates is mortal without actually seeing him die
<GordonW> the probability that socrates is mortal given that socrates has displayed all the characteristics of men up until now
<Eliezer> *ah*
<Eliezer> where does that probability come from?
<GordonW> the BT :-P
<Eliezer> well, sure, but where did you get whatever specific probability you're using?
<Eliezer> is it part of a logical definition of "mortal"?
<GordonW> hang on, noise interference
<eg0mech> hmm, i have no idea what i'm talking about here, but i want to play too
<BJKlein> hehe
<GordonW> sorry, I lost my train of thought, trying to get it back
<GordonW> okay, I've been told that men are mortal
<Eliezer> told?!
<Eliezer> unscientific
<GordonW> okay, I've seen men die
<Eliezer> ooh, better
<GordonW> thus I've induced that men are mortal
<GordonW> since I've never met an immortal man
<GordonW> to the best of my knowledge
<BJKlein> that's right you havent met me yet
<Eliezer> hm... this thing called "men"
<Eliezer> it intrigues me
<Eliezer> why do you apply the label "men" to some things and not to others?
<GordonW> man is a cluster of characteristics
<Eliezer> well, so is "wiggin"
<GordonW> one of the characteristics of man is mortality
<GordonW> it has other ones like bipedal, has nose, has hair, evolved from apes, etc.
<Eliezer> I see
* EmilG unidles.
<Eliezer> so you started out by observing bipedal things, with noses, and hair, that wear clothes and employ tools and speak language
<Eliezer> and these characteristics were all correlated with each other
<GordonW> yes
<Eliezer> like... things that wear clothes, also employ language, and contain red blood, and have ten fingers, and so on
<Eliezer> and you sort of assumed that it made sense to check whether wearing clothes... the whole cluster really... also displayed correlation on this new characteristic, "mortality"
<Eliezer> and lo and behold, it did
<Eliezer> so now mortality is one of the things you tend to conclude, when you see that a thing wears clothes and uses language
<GordonW> now, given that socrates has most of the characteristics that I think of as men having, I conclude that he also probably has the characteristic mortal
<Eliezer> *now* use Bayes' Theorem to formalize your reasoning
<GordonW> and I can find just how probable it is, given the number of characteristics of men he displays
<GordonW> okay, so, the probability that socrates is mortal is the probability that socrates is a man given that socrates displays the characteristics of men
<Eliezer> mm, good enough, I guess
<GordonW> yes, I know that there's more to it, but I'm not sure how to express it
<Eliezer> can you tell me why Aristotelian logic *cannot possibly* draw on Bayes' Theorem?
<Eliezer> and why your use of the word "man" to guess that Socrates is mortal *must* draw on Bayes' Theorem?
<GordonW> because the BT uses priors that can change
<Eliezer> actually, one question at a time
<Eliezer> first - why does Aristotelian logic not draw on BT?
<GordonW> because in Aristotelian logic all the priors are part of the reasoning process, where in BT they are used to get the probabilities for the reasoning process
<GordonW> fell on my face again, huh?
<GordonW> because Aristotelian logic is based on absolute priors, rather than probabilistic priors
<GordonW> in Aristotelian logic man is a defined set of characteristics
<GordonW> in BT man is a cluster of priors that have been observed to occur together with a greater-than-chance probability
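Written out with the theorem quoted earlier, the guess Gordon is reaching for looks roughly like this, where M is "Socrates is mortal" and C is "Socrates displays the usual human cluster (two legs, language, clothes, ...)"; the near-1 value is an empirical estimate, not a logical certainty:

```latex
P(M \mid C) \;=\; \frac{P(C \mid M)\,P(M)}{P(C \mid M)\,P(M) + P(C \mid \neg M)\,P(\neg M)}
\;\approx\; 1 - \varepsilon, \qquad \varepsilon > 0
```

Almost everything ever observed to show the cluster C has also turned out to be mortal, so the posterior is very high, but it remains a probability, never Aristotle's "mortal by definition".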
<GordonW> ping -- I'm still connected, right?
<BJKlein> yup
<GordonW> okay, so we're just waiting on Eli
<taza0> he would be the necessary mover
<GordonW> well, he has a known bad connection
<GordonW> hmm, 0% packet loss
<GordonW> that's okay, I can wait a little bit for the True Knowledge ;-)
<GordonW> (just kidding there, of course)
<Eliezer> what you're trying to verbalize is the intuition that science is based on observation, while Aristotle just made stuff up as predefined
<GordonW> yes
<Eliezer> so you think of Bayesian deduction as being based on empirically observed clusters, while Aristotelian reasoning is based on clusters that Aristotle made up
<Eliezer> in fact the problem is worse than that
<Eliezer> Aristotelian classes don't use clustering - period
<Eliezer> they are collections of individually necessary, together sufficient characteristics
<Eliezer> no mention of clustering there
<Eliezer> Aristotle does not say that, e.g., all clothes-wearing creatures are mortal
<Eliezer> but rather that you *cannot* know, at all, whether Socrates is a "man", until you observe him to be mortal
<Eliezer> it's not just based on Authority's clusters
<Eliezer> it isn't based on clusters at all
<MitchH> Could we say that aristotle's logical categories are akin to suggestively named lisp tokens?
<Eliezer> why, gee, I do believe they are
<Eliezer> it's not the point under discussion, but as points go, it happens to be dead on target
<EmilG> Heh. Good one Mitch :)
<Eliezer> heh
<Eliezer> Gordon's Net connection is not absolutely reliable
<Eliezer> it's a mere empirical regularity
<EmilG> Eli: I always wondered why the BPT is even called a "theorem"...
<EmilG> It's not like something that has a difficult or interesting proof.
<Eliezer> or even a "proof" per se
<EmilG> Yeah. Algebraically it's just a few products and sums.
<MitchH> In my own effort to keep the discussion alive: are "fitness landscapes" as they would be used in AI a way of mathematically identifying "clusters" of characteristics -- somewhat akin to the way our visual modalities tend to identify things in terms of objects rather than very busy panoramas of photons?
<Eliezer> nope
<Gordon> hmm, I guess my own lisp token comment never made it?
<MitchH> gordon: no
<Eliezer> fitness landscapes occur when you're considering several different versions of something with respect to some standard of fitness
<Gordon> that's okay, you got pretty much the same point across, Mitch, just a little later
<Eliezer> thingscapes are different spaces
<Gordon> so, anyway, second question?
<MitchH> fitness landscapes are more for decision making then?
* MitchH lets Eliezer and Gordon continue their discussion
<Eliezer> MitchH - if the fitness criterion is "desirability", for example
<Eliezer> anyway, Gordon (or anyone): can you give a *proof* that Aristotelian "logical truths" *cannot* draw on Bayes' Theorem?
<Gordon> sorry, I can't think of one :-(
<Eliezer> anyone?
<Eliezer> hint: logical truths are true of all possible worlds
<MitchH> Being arbitrarily defined, they are only arbitrarily executed: if anyone disagrees over what Aristotle means by any of his terms, they relay no information. What is man? what is mortal? Bayesian constructs could be identifiable without any language at all?
<EmilG> Not me, I haven't been following this conversation from the beginning to know what Aristotelian logic is.
* MitchH unleashes a ramble
<MitchH> Philosophy = endlessly recursive definition of semantics. That is my response.
<Gordon> I think you're going to have to help us out on this one; we're not quite up to it yet other than seeing that Aristotelian logic is empty
<Eliezer> why, under Bayes' Theorem, is an observation X, evidence about Y?
<Eliezer> i.e., you observe X rather than ~X, and from this you adjust your confidence in Y rather than ~Y
<Eliezer> why?
<Gordon> because P (Y | X)
<Eliezer> that's not Bayes' Theorem
<Eliezer> cite the actual definition
<EmilG> There are several formulations
<Gordon> P (Y | X) = P(Y & X) / P(X)
<Eliezer> hm, Gordon's is a new one on me
<Eliezer> I'm trying to think it through to see if it actually equates to the BT I know
<Gordon> that's the way I learned it
<Gordon> it calls it BT in the book
<Eliezer>
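For reference: the formula Gordon cites is the definition of conditional probability, and the form usually quoted as Bayes' Theorem follows from applying it in both directions:

```latex
P(Y \mid X) = \frac{P(Y \cap X)}{P(X)}, \qquad
P(X \mid Y) = \frac{P(Y \cap X)}{P(Y)}
\;\;\Longrightarrow\;\;
P(Y \mid X) = \frac{P(X \mid Y)\,P(Y)}{P(X)}.
```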



