What Constitutes "me"?
#121
Posted 15 November 2004 - 02:07 PM
At some point as the quality of experience between two individuals is shared completely, their merged perspectives and shared memories combine to form a new self that contains both contributing members for at least the duration of the joining.
Only the truly loving, tolerant, and fearless need apply, please.
#122
Posted 15 November 2004 - 02:29 PM
Also, what you're describing here, the experiences of two individuals being shared completely, is not as far-reaching as a convergence of timelines would be. At such a point of convergence the two beings would be identical, equivalent to perfect duplicates of each other.
I explain this further in that thread.
#123
Posted 15 November 2004 - 04:42 PM
This is after all how memetic mutation supplants genetic. It adapts far faster.
#124
Posted 15 November 2004 - 10:45 PM
Lazarus, I have to hand it to you, you think things out very well, and I like the style...
Okay, not much to argue with here. I agree with most of what you say, if I make a single assumption. Without that assumption, I don't necessarily disagree with much, but it becomes somewhat irrelevant.
The assumption has to do with simulation and duplication. It also goes back to Don Spanton's question of whether he becomes a simulation of himself via the incremental neuronal replacement model, and at what point this happens. This of course goes all the way back to question 3 at the start of this thread:
"If I copy my brain, only one neuron at a time, replacing each neuron with that copy, over a long period of time, will I still be me at the end? When will the transition have occurred?"
A variety of views have been expressed here, and as with much in these philosophical debates, a single basic assumption can take the same body of scientific evidence and theories and come to completely different conclusions. At some point a personal belief must enter the system, even if only veiled as an "assumption" based on probabilities: even those probabilities are giving purely arbitrary values to certain basic stipulations that affect the entire value structure. I like cosmos' usual approach of defining uncertainty as his most basic value (again an assumption), but one that seems to be the most resistant to self-contradiction.
However, in this case, I will not sit by idly. To the task, then.
First, to answer Don Spanton, as I promised previously. While I speak of course with admitted uncertainty and with admitted ignorance, I personally am inclined to think that, if transferred into a purely deterministic simulation (e.g. what I call a "classical computer"), then yes, you become a simulation of yourself. A simulation with complete fidelity, perhaps, but the new you will not be "you". No outside observation would be able to determine this; only you are aware of what's you and what isn't. If the transition is smooth and slow, then there might not be a specific moment in time at which you are no longer you. I've been thinking about this over the past few days, and I can come to no other conclusion.
Part of my reasoning has to do with this whole idea of complete fidelity. I have seen it argued by many that a perfect copy of one's self is the same as the original, so that if killing one's self could somehow benefit the copy, they would do it.
In my opinion, it is not the complete fidelity of information that makes a copy of you really "you". The whole perfect copy thing is a red herring. As with many realizations I've had recently, this came to me by way of a dream (it's sad to have received so many insights from dreams, because it only continues to remind me of the prophets of many religions who received such inspired divine messages, and yet I don't consider my "revelations" to be divinely inspired; rather, it's just the work of my mind operating on things from a different approach...)
I'm sure many have had dreams like this, and I have had many myself. But because I was thinking about what constitutes "me", I read more into it in the moments after the dream when the details, and the mood, and the very thought processes themselves are still in the mode they were in before the dream ended. The post-game analysis, if you will, when you're still hyped up on the game.
At this point in time, I don't remember any of the details; this was several days ago. However, I do remember the post-game analysis. And what I remember was this: in the dream, I was not really myself. I don't just mean I was play-acting some other person: it was as if I were some other person. I had different values, different ways of making decisions and of viewing the world. It was as if I had a completely different belief structure. In the dream, I didn't just act differently, I actually thought and felt and believed things I would not normally think or feel or believe.
In other words, it was as if I had someone else's mind. Forget perfect fidelity, forget marginal fidelity. It wasn't me. But the point that struck me was, "I" was still the one experiencing it. It would be as if I could have "me" transferred into another person, and think and feel as though I were that person, with that person's memories, feelings, emotions, values, everything. But it would be "me" that was experiencing it. I currently do not experience what Don Spanton experiences; he does! Not because of his memories, or his values, or his brain wiring. He experiences those things because... well, it sounds damn non-materialist now that I've said it.
The point is, perfect fidelity in copying isn't the issue. I have no problem with the assumption that we will be able to simulate sentient minds that are indistinguishable from "real" human minds. I have no problem with the idea of copying someone with perfect fidelity, or of merging minds, or splitting minds, or anything. But I separate the outward appearance of those sentiences from the inner "self" of those sentiences. A copy of me is not "me", no matter how perfect a copy, even if there were no physical difference measurable from within the system.
Whether this means that there is a component of us outside the system (the "soul" or ghost or whatever), or if it is a component inside the system which is immeasurable (e.g. quantum states), or whether it's something intrinsic in the system that is not measurable because it simply is (I don't know exactly what I mean by this, it's sort of a vague thought at the moment, I'll try to distill it over the next few days), I can't say.
However, given that a) I don't believe that we can be duplicated, b) I believe that a human mind can be simulated to the satisfaction of any physical test of sentience, and c) complete fidelity alone does not describe me, I conjecture that a transference of my mind into a classical computer, no matter how complex, will result in a simulation of me rather than me. Whether that simulation is sentient is irrelevant to me, since it's not me.
Now, as I've said before, I'm leaving the door open for non-classical computers, a category to which "quantum computers", a rather nebulous term if you ask me, belong.
A final note related to cryonics: now that I no longer regard complete fidelity as the requirement for "me" to continue, I have a little more respect for the prospects of reanimation. I've seen analysis of how much would have to be preserved to preserve a person, and how much damage could be taken in the freezing/vitrification/other processes involved. Now that I am starting to draw a big divide between the information and "me", I no longer see it as necessary to preserve things down to the molecular level (a very daunting task, currently unattainable). As some, including Robert Bradbury and Ray Kurzweil, have discussed, simply preserving the position and number of neurotransmitter sites on all interneuronal connections might be sufficient.
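To get a rough sense of scale for that "neurotransmitter sites on all interneuronal connections" preservation target, here is a back-of-envelope sketch. The neuron and synapse counts are standard rough figures, and the bytes-per-synapse value is purely my own assumption for illustration; none of these numbers come from the posts above.

```python
# Back-of-envelope sketch (rough standard figures; bytes_per_synapse is an
# assumed value, not from the post) of the information scale implied by
# preserving the position and count of neurotransmitter sites per connection.

neurons = 1e11              # ~10^11 neurons in a human brain (rough figure)
synapses_per_neuron = 1e4   # ~10^3-10^4 synapses per neuron (upper rough figure)
bytes_per_synapse = 10      # assumed: partner id + position + site count

synapses = neurons * synapses_per_neuron
total_bytes = synapses * bytes_per_synapse
print(f"~{synapses:.0e} synapses, ~{total_bytes / 1e15:.0f} PB at {bytes_per_synapse} B each")
```

Even with these crude assumptions the result is on the order of petabytes, which is daunting but vastly less than a molecule-by-molecule record of the brain.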
You see, I break it down into two problems. The first is restarting a mind that is essentially as identical as possible to the mind before the cryonic suspension. The other is restarting the "original" mind. In my mind (pun intended), these are two very distinct problems.
The first requires high fidelity, but places no requirement on the prevention of duplicates, or the issue of destruction and/or external reconstruction.
The second problem does not require high fidelity. I'm not even sure it requires fidelity at all. This is the "me" that's doing the observing of itself, regardless of what state that "self" is in. I am sentient, and Don is sentient (I assume), but I observe myself, and he observes himself. There's the me of my actions and physical body and even my thoughts and emotions, and then there's the me that is observing all of this.
Now, the ideal situation is admittedly to solve both problems. However, as far as I'm concerned, I would rather solve the second problem at the expense of the first, rather than the other way around. Otherwise, I'm just creating a duplicate of me, and if I'm not there to enjoy it, then why should I care? It would be the same as having a child to continue my legacy, or saving the life of a friend or stranger at the expense of my own. Sure, that person lives on, but I don't. Just because that new person has my thoughts and emotions and memories, that doesn't make me any less lost to oblivion.
That doesn't mean I wouldn't sign up for cryonics: after all, I still don't know if the second problem (preservation of the observing me) will be solved or not, and if it were solved and I didn't sign up, that would be a great loss.
Hmm, looking back, I can see that I'm probably rehashing a long list of positions held by other people which have already been "asked and answered". But I'm new to all this, so I'm thinking in a rather stream-of-consciousness kind of mode right now.
#125
Posted 15 November 2004 - 11:59 PM
I like cosmos' usual approach of defining uncertainty as his most basic value (again an assumption), but one that seems to be the most resistant to self-contradiction.
Funny you should say that, because while I am reluctant to label things as certain, I do not hold that reluctance to be applicable to all concepts and notions. I would characterize it as a reluctance to label assumed certainties as certainties; in almost all cases my expression of confidence in a view would be a statement of probability favouring that view rather than of certainty in it.
A final note related to cryonics: now that I no longer regard complete fidelity as the requirement for "me" to continue, I have a little more respect for the prospects of reanimation.
Even throughout this debate, when I discussed perfect and imperfect duplicates and the requirement of 100% fidelity for two beings to be considered identical at some point on the timeline (the point of convergence), I did not invalidate cryonics. Any slight and essentially negligible changes that occur to a person in cryonic suspension over time (rather than as a result of duplication errors) would not necessarily change the overall person, had he/she been revived. As bgwowk said, changes occur all the time in a living body, and you still consider yourself essentially the same person you were yesterday, the semblance of memories and experiences from the past drawn out with the supposed illusion of self.
Edited by cosmos, 16 November 2004 - 12:20 AM.
#126
Posted 17 November 2004 - 10:48 PM
It is irrational because ordinary continuity of self does not involve duplication: we change over time, even though this change is usually cumulative and incremental. Therefore, if there can be identity (survival) between dissimilar selves, there certainly should be identity between identical selves.
BUT one aspect of the self in question is a desire to continue the dynamic pattern forward in time (desire for continuity of the evolving self).
So no matter how many duplicates of a self are made, the original self does not wish to accept dying -- it continues to desire to survive, itself, as an incarnate existence beyond any attempt to redefine identities.
All the duplicates, as duplicates, would likewise have the same desire.
It is possible for a human personality to be talked into accepting death, but nevertheless this desire for continuing life exists -- it is merely countermanded. On the other hand, the survival of a duplicate or near-duplicate might make accepting death much more palatable, from a logical point of view.
So, "objectively" (outside the self in question), duplication seems to meet all criteria for survival of the person. Subjectively, however, from the perspective of the person, duplication alone does not satisfy all the criteria for self-survival.
Since a duplicate of a continuing person would feel the same way about its own survival, it would probably be the most practical to understand the duplicate as a separate person, especially since it would begin differentiating across time, and it might be interacting in same-time contexts with the "original".
The duplicate of a destroyed person might have trouble thinking of itself as a continuation of the previous person, but there would be no practical objections to it doing so. It might prefer to see its personal origin in the actual duplication process.
This all relates to present-day human beings and their language categorization schemes. Posthuman mindsharing/groupmind etc. would be an entirely different context.
#127
Posted 19 November 2004 - 08:43 AM
Still, it is an illusion. I wouldn't mind if my memories were suddenly replaced with some other memories. I would happily continue my continuity, only to be sure that I am, in fact, that other person.
There is only one number 23, because a number is a concept. It may be written in many places, but it is just one number 23. The self is also a concept, written all around. It was in my head yesterday, I remember. It is there today also. I guess it is elsewhere too, but I have no memory connection to those occurrences. I have some language information about them from other people, though.
Techno-telepathy may one day make this case clear. Peek into another person's mind, and you will feel as one with her/him! Just as you feel as one with the yesteryou.
I am an overyou.
#128
Posted 20 November 2004 - 12:19 PM
Who is the real me? Everybody is.
#129
Posted 05 December 2004 - 02:16 PM
"If I upload my brain, will that upload be me?
If I clone myself and download my brain into that clone, will the clone be me?
If I copy my brain, only one neuron at a time, replacing each neuron with that copy, over a long period of time, will I still be me at the end? When will the transition have occurred?"
None of them will be you, because they do not have your individual and subjective consciousness. They can be utterly like you, but they will never be you.
#130
Posted 08 December 2004 - 09:34 PM
"If I copy my brain, only one neuron at a time, replacing each neuron with that copy, over a long period of time, will I still be me at the end? When will the transition have occurred?"
"None of them will be you, because they do not have your individual and subjective consciousness. They can be utterly like you, but they will never be you."
Then you are already dead, and have died many times. More than half the atoms that compose your brain right now (especially hydrogen and oxygen) will be gone in two weeks, and replaced with new ones. Biochemically, a human being is a waterfall, not a statue. The only thing about you that remains somewhat constant over time is your pattern. The particular matter that composes you is constantly being removed and replaced.
Do you mourn the loss of your predecessors? Do you live in fear knowing that most everything that is now you will be in a wastewater system in a few months? I don't. You shouldn't either. You are a pattern, not particular atoms.
Your own subjective sense of continuity is proof that matter replacement does not destroy the subjective self. But of course any entity that was built by matter copying, matter reconstruction, or matter replacement with sufficient similarity to a predecessor will necessarily feel subjective continuity with the predecessor, as you do. So what does that prove? NOTHING! That's the point. If you are comfortable with your present existence of having your material body continuously destroyed and rebuilt by natural processes, then it's completely arbitrary to object to artificial processes that would merely do the same thing.
---BrianW
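BrianW's pattern-versus-matter point can be put in the form of a toy program (entirely illustrative; the `Cell` model and its states are my own invention, not anything from the posts). Every piece of "matter" is swapped out one unit at a time, yet the pattern, the thing he argues is the person, is untouched:

```python
# Toy illustration (not from the original posts): identity as pattern.
# A "brain" is a list of cell objects; we replace every cell with a newly
# allocated copy, one at a time, and check that the overall pattern (the
# ordered states) is unchanged even though no original object remains.

class Cell:
    def __init__(self, state):
        self.state = state

def pattern(brain):
    """The 'pattern' is just the ordered tuple of cell states."""
    return tuple(cell.state for cell in brain)

original = [Cell(s) for s in ("a", "b", "c", "d")]
before = pattern(original)
original_ids = {id(c) for c in original}  # identities of the original "matter"

brain = list(original)
for i in range(len(brain)):           # incremental replacement, one cell at a time
    brain[i] = Cell(brain[i].state)   # new matter, same state

assert pattern(brain) == before                    # the pattern survives intact
assert not original_ids & {id(c) for c in brain}   # none of the original matter remains
print("pattern preserved:", pattern(brain) == before)
```

The disagreement in this thread is, of course, precisely over whether the final `assert` is the one that matters.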
#131
Posted 09 December 2004 - 11:43 AM
#132
Posted 09 December 2004 - 07:44 PM
"I think that continuity is required for the experience line to continue, and I think that completely replacing you with a machine would not do this, precisely because it is entirely different in matter and therefore is a copy."
You raise a very important point. The question of whether humans can survive uploading or transformation into brains of alternative physical form is NOT the same question as whether humans can survive atomic replacement, atomic disassembly/reassembly, or atomic duplication that preserves THEIR PRESENT PHYSICAL FORM. I am only addressing the latter question, not the former.
My position is that if a human being goes into a box, and later a human steps out of that box that is no more different from the original than you are after a night's sleep, then that human being is the SAME subjective human being that went into the box. This conclusion holds no matter how long the human was in the box, or what happened inside that box. Period.
Your subjective existence is dependent upon the existence of a particular PROCESS (and maybe a particular kind of substrate), but NOT a particular hunk of matter. Everything we now know in medicine tells us that.
---BrianW
#133
Posted 12 December 2004 - 08:29 PM
As I understand MWI, a person is continually splitting into an enormous number of alternate versions. There are a huge number of present versions of a person which are not physically continuous with each other but which are physically continuous with common ancestor versions.
If you think there is a reasonable possibility that a person multiplies into many versions according to the MWI, then with which of the following possibilities would you most agree?
1. All versions of a person are physical manifestations of the same person.
2. An original person exists only for a brief moment and is replaced by many new persons.
3. There is a single line of versions that is the original person and all other branches are other persons.
#134
Posted 13 December 2004 - 09:56 AM
"Out of the three possibilities you mention, it's number (3) that I'd go with."
If only one line of versions is the original, then what determines which "one" of the vastly numerous branches is the "original" line? How do you know that you are not in something other than the original line of versions, and that the person who is now in the original line is someone other than you?
"Once quantum decoherence has occurred, it seems to me that all the other copies of a person in the different QM branches qualify as *different* people, since they are then all having slightly different experiences and each does not experience any of what the others are experiencing."
This seems to be more like possibility number (2).
#135
Posted 13 December 2004 - 11:14 AM
#136
Posted 13 December 2004 - 02:39 PM
"If only one line of versions is the original, then what determines which 'one' of the vastly numerous branches is the 'original' line? How do you know that you are not in something other than the original line of versions and the person who is now in the original line is someone other than you?"
Well, I don't subscribe to MWI, though I certainly like contemplating it. In my mind, if MWI is true, then I don't think that there is an "original" line. This would presume that, even if I've been living in perfect parallel with that line, going forward, the odds are overwhelming that I will not remain in parallel with that line unless I am in that line. Moreover, it presumes that the "original" line's fate is already predetermined (MWI leads, as I understand it, to a situation where one's future is effectively predetermined, regardless of our interpretation of time, i.e. regardless of whether that fate has "happened" already or not).
On the other hand, if I merely presume that the line that leads to the moment I am typing this message is the "original" (since, it's the line that matters to me, or at least to the me that typed the word "original" fifteen seconds ago...), then yes, the following statement would hold:
"There is a single line of versions that is the original person and all other branches are other persons."
Note that, going forward, the "line" that was original will continue to split. But at any given instant, the only versions that matter are those which paralleled my line up to that instant, plus the fraction that were still "coherent" with that line. We can argue how long decoherence takes, but unless it is instantaneous (smaller than the Planck time), multiple lines will be involved.
#137
Posted 13 December 2004 - 09:51 PM
1. All versions of a person are physical manifestations of the same person.
2. An original person exists only for a brief moment and is replaced by many new persons.
3. There is a single line of versions that is the original person and all other branches are other persons.
Confusion is caused by failure to distinguish the objective from the subjective. *Objectively*, ALL branching versions are valid continuations of the original person. *Subjectively*, each independent copy perceives only their own ancestral branches as authentic. There is no conflict.
The question is philosophically similar to asking which of the following do you accept:
1) All awake people in the world are conscious.
2) Nobody is really conscious. Consciousness is an illusion.
3) There is only a single person in the world who is conscious (me).
Once again, the answer simply depends on whether you are being subjective or objective.
So it is with all person duplication scenarios (quantum or otherwise). Objectively, all duplicates with reasonable physical fidelity are valid continuations of the original. Subjectively, every time you are duplicated, you will only perceive yourself as surviving as one of the duplicates (a random selection).
---BrianW
#138
Posted 14 December 2004 - 10:13 AM
I see that the first word in your post is “Well” and the last word is “involved.” For illustration purposes, I will assume that you wrote the first word first. Now, let us say that Jay1 is the person who began the post with the word “Well.” Let us also say that Jay101 is a descendant version of Jay1 who ended the post with the word “involved.” Next, let us say that Jay102 is another descendant version of Jay1. Rather than end the post with the word “involved”, Jay102 ended the post with a different word. Obviously, Jay101 and Jay102 are not coherent with each other. Both Jay101 and Jay102 can truthfully say, “I am the Jay1 who began my post with the word ‘Well’.” However, Jay102 cannot truthfully say, “I am the Jay101 who ended my post with the word ‘involved’.” With this illustration, I can see your point that only one line of versions is you. The Jay101 line of versions is you and the Jay102 line of versions is not you. However, both Jay101 and Jay102 could truthfully claim to be Jay1. There are no contradictions here because Jay101 and Jay102 are subsets of Jay1. It is like saying that Spike is a dog and Rover is a dog but Spike is not Rover.
Well .... involved.
Now, my question was, “Which ‘one’ of the vastly numerous branches is the ‘original’ line?” In the context of a set of divergent descendant versions, I would have to say that the original line stops at the last common ancestor. Beyond this, none of the versions within the set can be within the original line because that would exclude other versions from being within the original line. A tree can have branches but a line cannot.
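The Jay1/Jay101/Jay102 lineage argument can be sketched as a toy version tree (the names come from the post; the code model is my own, purely illustrative): "X is a continuation of Y" means Y appears among X's ancestors, and the "original line" stops at the last common ancestor of any two divergent branches.

```python
# Toy sketch of branching version lineages: each version records its parent,
# both divergent branches count as continuations of their common ancestor,
# but neither branch is a continuation of the other.

class Version:
    def __init__(self, name, parent=None):
        self.name, self.parent = name, parent
    def ancestors(self):
        v, out = self, []
        while v is not None:   # walk up the lineage, including self
            out.append(v)
            v = v.parent
        return out

def continues(descendant, ancestor):
    return ancestor in descendant.ancestors()

def last_common_ancestor(a, b):
    a_anc = a.ancestors()
    return next(v for v in b.ancestors() if v in a_anc)

jay1 = Version("Jay1")
jay101 = Version("Jay101", parent=jay1)   # ended the post with "involved"
jay102 = Version("Jay102", parent=jay1)   # ended it with a different word

assert continues(jay101, jay1) and continues(jay102, jay1)  # both are Jay1
assert not continues(jay102, jay101)                        # Jay102 is not Jay101
assert last_common_ancestor(jay101, jay102) is jay1         # the "original line" ends here
```

In set terms this matches the Spike/Rover point: membership in "descendants of Jay1" does not make two branches identical to each other.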
"It is no comfort to this version of 'me' to know that copies of me live on in other QM branches. Those copies in the other branches are not 'me'. If I die in this QM branch I'm dead. Period."
MWI would actually keep you alive in this way. The point of death is not well defined. Suppose a person dies at 15:17:21 on some Tuesday afternoon. In the MWI, there are certain to be descendant versions of the dead person at 15:17:22 that will somehow have returned to life. These versions may be a tiny percentage of the descendants of the 15:17:21 person, but they will keep splitting into more persons. A certain fraction of those descendants are certain to be restored to good health. This sounds comforting. However, it may not be comforting to think that a large fraction of the descendant versions may spend the rest of their lives in a very unhealthy condition.
Edited by Clifford Greenblatt, 16 December 2004 - 09:28 AM.
#139
Posted 03 February 2005 - 12:48 PM
Yours
~Infernity
#140
Posted 22 February 2005 - 07:30 PM
Subjectively, everytime you are duplicated, you will only perceive yourself as surviving as one of the duplicates (a random selection).
How do you know this is random?
#141
Posted 08 April 2005 - 11:21 AM
--------------------
We chase misprinted lies
We face the path of time
No one to cry to
No place to call home
My gift of self is raped
My privacy is raked
If I can't be my own
I'd feel better dead
Oh well, how would "you" "feel" once "you're" dead?
~Infernity
#142
Posted 08 April 2005 - 02:08 PM
It doesn't make any sense to draw an arbitrary line on the amount of matter in the brain that can be replaced. Everything that makes up our mind is just the reception of signals, so if those signals are received properly, everything functions normally and my mind is unchanged. The mind is the product of the signals.
Picture an internal combustion engine. The brain would be the individual parts, but the mind would be the process of the engine's operation: the amount of gas that is sucked through it, the amount of air, the contents of the exhaust, the amount of torque. If the alternator on the engine breaks, the whole car could be rendered non-functional, just as when parts of our brain break today. But if you replace the alternator, the car doesn't know the difference; it resumes its car-function to the letter.
Likewise, if you were to make a perfect scan of that engine and simulate it on a computer, you would still have all the essential elements of the engine, you'd have the 'identity' of the engine on the computer, and could run it just the same as you did before with perfect results. Appealing to the loss of brain matter is dualism. The only important part of a brain replacement is reacting (and changing) to the same stimulus the same way as the previous tissue did.
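The engine analogy amounts to a claim of substrate independence, which a toy sketch can make concrete (the engine formula and all its numbers are arbitrary, invented purely for illustration): if the engine's "identity" is its input-output process, then a metal engine and a scanned-and-simulated engine are indistinguishable by any functional test.

```python
# Toy sketch of substrate independence: the engine's "mind" is taken to be
# its input-output behaviour, so two different substrates implementing the
# same process give identical results. The formula is arbitrary.

def engine_process(fuel, air):
    """The 'process': torque from fuel and air (arbitrary toy formula)."""
    return min(fuel * 10.0, air * 2.5)

class MetalEngine:
    """The 'physical' engine: runs the process directly."""
    def run(self, fuel, air):
        return engine_process(fuel, air)

class SimulatedEngine:
    """Same process on a different 'substrate' (a log-keeping simulation)."""
    def __init__(self):
        self.log = []
    def run(self, fuel, air):
        torque = engine_process(fuel, air)
        self.log.append((fuel, air, torque))  # bookkeeping the metal engine lacks
        return torque

metal, sim = MetalEngine(), SimulatedEngine()
inputs = [(1.0, 5.0), (2.0, 7.0), (0.5, 1.0)]
# No input distinguishes the two engines, even though their internals differ.
assert all(metal.run(f, a) == sim.run(f, a) for f, a in inputs)
```

Whether such functional equivalence settles the identity question is, of course, exactly what the rest of the thread disputes.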
#143
Posted 02 June 2005 - 09:24 AM
#144
Posted 02 June 2005 - 12:10 PM
I suspect that for many people the distinction between virtual and real is all too blurred. The old debate about life being an *illusion* now has the variant that life is a simulation. However if it is and you cannot tell the difference then what does it matter?
The real issue is when the computer simulation is the model that is able to run a robot assembly program for building new engines and that stage of development already exists.
#145
Posted 09 June 2005 - 04:06 AM
The me as of June 8, 2005 is a single subjective experience. But say, for example, I successfully uploaded as two separate subjective experiences and destroyed the old body. What happened to the one I am now? Did it just die, because what was one became two different ones? They are two divergent subjective experiences. They can't both be the me of now, because each entity can only subjectively perceive its new self, not the other one as well. Or can they both be me, separately?
#146
Posted 09 June 2005 - 02:48 PM
1. Your copy will be like you only at the moment the copying process finishes, but from then on each of you will have a different life experience.
2. When you die and your copy remains living, that doesn't mean things are OK, because you do not exist; *you* have no consciousness. It's just somewhat promising that the information of you up TILL the moment of completing the copying process remains. Not that you'll be bothered by it...
Yours
~Infernity
#147
Posted 30 June 2005 - 01:19 AM
Suppose you happen to be a man of Western European descent. Suppose a man of central African descent is approximately your age and of approximately the same intelligence. Now, suppose “you” are able to live for millions of years without ever losing spatio-temporal continuity of animation. However, after a million years, “you” have been radically transformed into a posthuman through a very long process of gradual modification. Memories of “your” early years have been gradually moved to an historical storage vault to make room for filling “your” mind with much greater things. After a million years, gradual changes have accumulated into extremely radical changes, both physically and mentally. Despite some significant differences, the man of central African descent has a great deal in common with you both mentally and physically, but the posthuman “you” is much more like an alien from another galaxy. To be consistent with naturalist philosophy, the man of central African descent would have to come radically closer to qualifying as being “you” than the posthuman “you” does.
#148
Posted 30 June 2005 - 01:48 AM
#149
Posted 30 June 2005 - 05:36 AM
#150
Posted 30 June 2005 - 08:26 AM
"I happen to be a man of central African descent, you insensitive clod! *Runs away weeping*"
I was simply trying to avoid using anything too close to identical twins in my example. You could swap the two descents in the example. Women could substitute "woman" for "man." Nothing insulting, degrading, disrespectful, or insensitive was intended in the hypothetical example. However, I am open to whatever suggestions you may have to change the two descents to whatever you prefer, provided no one else would be offended. Please let me know and I could edit the post.
"So confusing."
There is some discussion on this in The New You thread. Hopefully it will give you some additional explanation, or ideas from others, to eliminate whatever confusion you may have.
Edited by Clifford Greenblatt, 30 June 2005 - 08:52 AM.