  LongeCity
              Advocacy & Research for Unlimited Lifespans





A Question for Those Who Don't Believe in the Soul


158 replies to this topic

#121 eternaltraveler

  • Guest, Guardian
  • 6,471 posts
  • 155
  • Location:Silicon Valley, CA

Posted 21 November 2004 - 04:05 AM

As I already said, I am tired of talking in circles. If you choose to read the article then do so, but if you don't then don't.

#122 Thomas

  • Guest
  • 129 posts
  • 0

Posted 21 November 2004 - 10:13 AM

Well, the answer to this question is in another topic, "What constitutes me".

You don't need an exact copy; you need a copy good enough to run the same self-referencing program.

#123

  • Lurker
  • 0

Posted 21 November 2004 - 10:59 AM

elrond, when I finish your article, and assuming it presents your position accurately, would you be willing to address discrepancies and other issues that I will raise?

From what I've read so far, I'm not very impressed with the way the article deals with the dilemma we've been discussing.


#124 eternaltraveler

  • Guest, Guardian
  • 6,471 posts
  • 155
  • Location:Silicon Valley, CA

Posted 21 November 2004 - 09:10 PM

elrond, when I finish your article, and assuming it presents your position accurately, would you be willing to address discrepancies and other issues that I will raise?

From what I've read so far, I'm not very impressed with the way the article deals with the dilemma we've been discussing.


I think the problem is, on both sides of this issue, the answer just seems completely obvious. You have to take one side or the other as one of your basic assumptions.

The article is based on my underlying assumption. Basically this: if any process can possibly create a duplicate, even if only one exists at a time, then it is not you. To me this seems very obvious. To you it does not. However, I have not seen any compelling reason thus far to make me change this opinion.

Now there are many scenarios I can think of where some percentage of you survives. Take out one hemisphere of your brain, for example, and replace it with an identical hemisphere. I would guess that something on the order of half of you survives, perhaps less.

The article does share most of my underlying assumptions. I don't see a way to prove either side, so no, the article does not prove my side, but it does have lots of thought experiments that relate to my assumptions. I do not necessarily agree with the reasoning of all the thought experiments. But I do agree with most of them.

I'll address points you bring up from the article, but honestly I really don't see that going anywhere.

""I know" is just "I believe" with delusions of grandeur"

#125

  • Lurker
  • 0

Posted 22 November 2004 - 05:07 AM

elrond said:

The article is based on my underlying assumption. Basically this: if any process can possibly create a duplicate, even if only one exists at a time, then it is not you. To me this seems very obvious. To you it does not. However, I have not seen any compelling reason thus far to make me change this opinion.


These differences would seem to be irreconcilable then. I think we've been through enough thread pages to come to that conclusion.

In the end, it may take explicit scientific understanding of the mind-brain relationship and consciousness (to the extent we can understand consciousness) to change your position.

#126 eternaltraveler

  • Guest, Guardian
  • 6,471 posts
  • 155
  • Location:Silicon Valley, CA

Posted 22 November 2004 - 07:45 AM

In the end, it may take explicit scientific understanding of the mind-brain relationship and consciousness (to the extent we can understand consciousness) to change your position.


Or to change your position :)

#127

  • Lurker
  • 0

Posted 22 November 2004 - 07:50 AM

Indeed.

#128 bgwowk

  • Guest
  • 1,715 posts
  • 125

Posted 22 November 2004 - 08:08 AM

cosmos wrote:

In the end, it may take explicit scientific understanding of the mind-brain relationship and consciousness (to the extent we can understand consciousness) to change your position.

On the contrary. There is NOTHING that can change elrond's position. There is no experiment, discovery, or insight that could ever change it. Something that is true by definition cannot be changed. Elrond's position is very simple. He defines people as objects. A duplicate object is not the original object. That is indisputably true as far as objects are concerned, and always will be. Case closed.

The issue that elrond is unwilling to confront is the question of whether people ARE objects to begin with. If people are not objects, then the truth that duplicate objects are not original objects is irrelevant.

elrond wrote:

Now there are many scenarios I can think of where some percentage of you survives. Take out one hemisphere of your brain for example and replace it with an Identical hemisphere. I would guess that something in the order of half of you survives, perhaps less.

What elrond does not appreciate is that this has already happened to him many times. Isotope studies show that the half-life of most hydrogen and oxygen atoms in the human body and brain is 10 days. The water that passes through us every day is constantly exchanging hydrogen atoms with the macromolecules of our body, and even whole proteins are eventually changed out. Both hemispheres of elrond's brain have been replaced many times throughout his life. He is not, and never was, a single object.
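The turnover implied by that half-life can be put in rough numbers. A minimal sketch, using the 10-day half-life quoted above and assuming simple first-order (exponential) replacement, which is my own simplifying assumption, not a claim from the post:

```python
def fraction_original(days, half_life_days=10.0):
    """Fraction of original atoms remaining after `days`,
    assuming simple first-order (exponential) turnover."""
    return 0.5 ** (days / half_life_days)

# With a 10-day half-life, under 0.1% of the original
# hydrogen/oxygen atoms remain after 100 days.
print(fraction_original(10))   # 0.5
print(fraction_original(100))  # ~0.00098
```

On this toy model, "replaced many times throughout his life" is an understatement: essentially none of the original atoms survive even a single year.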

We all live our lives subject to a continuous, relentless process of duplication and destruction of our material being. Ongoing duplication and destruction are as fundamental to human life as breathing. Consequently any doctrine that holds human beings are specific physical objects, as opposed to something else that merely requires an object as a vessel, is a doctrine that says we all die and are replaced by a doppelganger every few months. I don't make my life plans according to such a doctrine, and I suspect neither does elrond.

---BrianW

#129

  • Lurker
  • 0

Posted 22 November 2004 - 09:51 AM

bgwowk said:

The issue that elrond is unwilling to confront is the question of whether people ARE objects to begin with. If people are not objects, then the truth that duplicate objects are not original objects is irrelevant.


Correct. Perhaps in time, he (elrond) will be confronted by the nature of human existence as a process of constant replacement. A dynamic process, rather than a static object.

People generally aren't as absolute in their views over time as the words they use to express those views at the time.

#130 eternaltraveler

  • Guest, Guardian
  • 6,471 posts
  • 155
  • Location:Silicon Valley, CA

Posted 22 November 2004 - 10:24 AM

Yes, well, I am close-minded and downright pig-headed :)

#131 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 22 November 2004 - 02:44 PM

Hmm, this started off as a small reply, and quickly turned into a "stream-of-consciousness" style rant. Heheh, pun not intended.

We all live our lives subject to a continuous, relentless process of duplication and destruction of our material being. Ongoing duplication and destruction are as fundamental to human life as breathing. Consequently any doctrine that holds human beings are specific physical objects, as opposed to something else that merely requires an object as a vessel, is a doctrine that says we all die and are replaced by a doppelganger every few months. I don't make my life plans according to such a doctrine, and I suspect neither does elrond.

A red herring, a misrepresentation, and a bit sad that you would push such a statement after all the time we've spent in here. I thought we'd already moved beyond that point, not just several posts but several threads ago. Neither Elrond nor I have a problem with the slow swapping of atoms that occurs over our lives: at worst, it represents a very gradual "death" of our old selves, so smoothly merged with the new as to be for all intents and purposes completely irrelevant. At any given moment we retain well over 90% of our original structure, even assuming 10% is "actively" being reconfigured and swapped out for new proteins, mRNA, and basic chemicals such as water, glucose, lipids, vitamins, minerals, etc. Besides which, we are as much if not more "process" than "object", active chemical and electrical signaling than passive weighted neural network. But my brain is still my brain, and not the brain of my doppelganger living in some similar spiral galaxy 8 billion light-years away.

Somewhere in the universe, at some point in time, perhaps even at this instant, there might be another person out there who is virtually (if not completely) identical with me to the atomic/electric/kinetic level, and I may never even know about it. Is that person me? I certainly contend he is not. A person is both a process and an object, in the sense that a process is duplicable, and an object is not; I contend that in this case, we are more like the object. However, a process can change its constituents and retain its identity, while an object cannot; here I contend we are more like the process. We are neither mere processes nor mere objects; we are both.

Brian, your position is just as close-minded and dictated by belief as Elrond's, but somehow it's okay to bash on Elrond. There is no evidence, and only a very shaky philosophical argument, that a copy with 100% fidelity is sufficient to be "me", especially if we can argue successfully that it is not necessary. Your side can no more be proven than our side: it requires a fundamental belief that we are either unique or we aren't. Science can't prove it or disprove it to 100% certainty; it can at best say one side's probably right or not. Consciousness is a deeply fascinating topic, one that a person can spend his lifetime trying to understand with the best theories and technology available, but at the end of that lifetime, all one has is a better appreciation of consciousness, and some strong beliefs or assumptions. It's a question that cannot be answered with the same clarity as the question of what is 2+2. You're concerned with whether a copy of you can be "you". For me, that question is completely stupid; obviously it's not "you". I'm more concerned with the differences between our unique "selves", even duplicates. The color I see as green, do you see it as red (even though you call it green)? Would my duplicate see it as "green", since we share the same physiology and neurological structure? Or is it not a matter of neurological structure, but just an abstract, unquantifiable property of our consciousness? Some or all of those are probably stupid questions to you, but to me, they make perfect sense. We are each observers of our selves, a situation where we truly are greater than the sum of our parts. I know it can't be proven, or rather, it most likely can't be proven.

I don't even see that it needs to be: take the abortion debate. Both sides were at an impasse, because the basic question of whether there's a soul in a newly fertilized embryo is unanswerable by science, and unquestionable by religion.

In the '90s, I figured that meant it would basically become a battle of opinion and public policy: majority rule. However, science is proving that it can deal with the issue in a practical manner, something religion hasn't attempted yet. And that's what matters: practicality versus who's right. In the abortion debate, science and philosophy are leaning ever more towards realizing that the human mind is the seat of the "soul", the part of human life that matters, and the brain is the seat of the mind, and the brain is open to science. It's a best approximation of this "soul", and one that I suspect will lead to a ban on third-trimester abortions in the next 10-30 years. Possibly even some second-trimester abortions. The question of soul as something detached from the mind will remain for the religions to debate, while science will consider it a red herring and will have moved on to issues that "matter".

So I think it will be with the issue of duplicates. I suspect that technology will be tailored to meet the needs of us obstinate non-duplicists: scenarios such as Don described, where I can have a copy of me made, and we can share our lives, either through occasional syncing, or even through a continuous connection.

Or rather than upload, we can simply have a direct interface with powerful AI (based on our own minds, I suppose), and this hybrid version of us would allow us to retain our "selves" while having access to the power of raw computation that will be available in the future.

Or one of dozens of other scenarios that do not require the destruction of our original selves.

Meanwhile, you can happily upload yourself, or be destroyed and recreated somewhere else, and it won't really bother me. If you are destroyed and then recreated, both of you look pretty much like "you" to me, and that's all that matters. I suppose this is one point you and I can agree on: from an external observer, both of you will be identical. And this is why science cannot answer the question, because science will ALWAYS be in the role of the external observer. However, I, as an internal observer to myself, am the only person qualified to decide if I still exist, and given that my "observer" can only be in one place at a time, to me it's fairly obvious that a copy of me isn't me. Even at the moment of duplication, each copy of me has an observer that cannot observe the other, so it's not like divergence is the issue.

And if there is a difference, science might measure it, but if it doesn't, that does not mean the difference is not there. It's a provable but not disprovable claim; that a question isn't in the realm of science does not mean the question is invalid; it means science is insufficient. Like facts versus values: values are not provable. Does that mean that we toss them out the window? Go ahead if you want to, but I contend that you would be the obstinate fool then, not me. At a minimum there is the value against self-termination: without values, no finite logical system can ever completely avoid making the decision to self-terminate, unless that option is not known or physically possible. Yet that basic value to not self-terminate, or the set of values which leads to that value, is not defensible by science. Would you assert then that such values are meaningless and should not enter into a debate of life and consciousness? If not, then why would you exclude the belief that we are not duplicable?

But like I said, if you make copies of yourself, it doesn't affect me, so to all external observers, I suppose it doesn't matter. Upload to your heart's content, and destroy yourself to your heart's content. In fact, given that a copy of you could always through mere chance be created, I don't see why you don't assume that if you were to kill yourself at this very moment, you would survive. Why does that copy of yourself have to be saved somewhere? Why does it have to be recreated? Doesn't the mere possibility of such reconstitution qualify?

#132 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 22 November 2004 - 03:23 PM

I cannot be duplicated within this universe under the continuity principle for a very simple reason: I cannot diverge in space.


In my hypothetical (non-MWI) fission scenario, you can and do diverge in space by virtue of nanomachines swelling you to make room for cell fission, giving rise to side-by-side cells and cell connections which are then carefully disentangled and moved apart. Or don't even disentangle the two parallel networks. Run them both in-situ, or selectively activate one or the other as you please. I assert that they are both you, and you will survive no matter which of the two is activated.

Interesting scenario, a complex one that required complex thought. Or so I thought; the answer at least is wordy.

This symmetric fission is not as neat as you would put it. Splitting the cells in situ requires one of three assumptions:
1) The cells have to shut down or lose substantial functionality while splitting.
2) The cells can split, but, while splitting, are still single cells in two "separate" but conjoined networks, allowing for a complete interoperation of the two new brains for a brief period.
3) The cells can split and yet, while splitting, they can somehow make sure there is no "crosstalk".

I suspect the first scenario is the most likely; it is at least the most technically feasible. And it would in effect kill me in the process; if not outright, then at least by degrees it would render me substantially dead. That it might only kill me "in degrees" is mildly problematic to my assertion that I am not duplicable, but not so much as to dissuade me from that assertion.

But, let's not limit ourselves to technical ease, but consider that which is at least remotely possible.

In the second scenario, the crosstalk would be about as disastrous as complete dysfunction: again, it would probably kill me in the process, even if not outright. Again, this might be problematic to my assertion, but not enough to dissuade me.

In the third scenario, the lack of crosstalk would imply to me that each cell would have its essence essentially assigned to one brain or the other: a splitting of the "soul", if you will, although probably not exactly 50-50. Losing such a substantial portion of myself so suddenly leads me to believe that the process would kill me, perhaps even more so than the first two scenarios. This is ironic to me, because this third scenario is the least technically problematic: no dysfunction and no cross-talk.

However, I think my third scenario needs to be further refined.
3a) The functional portions of the neurons remain with one of the brains, and new functional portions grow and develop within the original neuron, but for the new brain. Then the cells split.
3b) The functional portions themselves split, then migrate as necessary to allow a clean fission of the cell.

3a, in my opinion, takes us back to the beginning: one brain is given preference, and either it is the seat of the original "me", or "me" is destroyed and two new persons are created. But, this is not symmetric, so we can exclude this scenario.

3b is only a further drilling down of the process: from brain, to functional portions (hemispheres, lobes, etc.), to neurons, and now to interneuronal connections and neuronal components. I suspect that, as with a simple asymmetric division of neurons, the division of these smaller components can never be truly symmetric: the best you can hope for is a fair 50-50 apportioning of original components.

Well, we can drill down further, but eventually we hit the molecular and even atomic levels. Still, given the limits of 3D space and traditional 4D spacetime, we can never achieve a truly symmetric split. At least, not without divergence in another dimension, either a coexisting but non-interacting space, or a truly diverging spacetime. Neither of these is testable or foreseeably testable, but we cannot rule them out.

Brian, I do have to admit one interesting thing I've discovered with the thought experiment.

As I observe the drilling down process, from brain to hemispheres to neurons to interneural connections, etc., to molecules to atoms, I still don't see a means of preserving "me" during the split: at best, I truly do retain only 50% of my original essence.

But, what's interesting to me is the level it is accomplished at. As I've contended before, 100% fidelity is not important, continuity is. This process preserves fidelity, which is nice, but it also presents a troubling question for continuity. I've always assumed that I cannot be split, because, after all, which part of me would get which parts?

But in taking this level-of-detail approach, I am now picturing it more as a, well, a picture. Take a picture, and cut it in half. Clearly, you cannot reconstruct it. Cut it in quarters, and remove opposite corners. Still, you cannot reconstruct. Cut the picture like a chess board, and remove the "white" squares. Again, you cannot reconstruct the picture, but you're getting closer. Well, what happens if you cut it up into a million by million grid, and remove every other piece? While quite a bit of detail is missing, it can be reconstructed. Would this apply to the two "observers", the "souls", or whatever we label them? If they were material, even if in a "nonmaterial" way (whether outside the system, or as part of a composite quantum state), then I would have to say possibly yes. If not, then possibly still no.
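The checkerboard intuition above can be sketched computationally. A toy reconstruction, under my own assumption that a removed square can be estimated by averaging its surviving neighbours; the grid and values are illustrative, not from the thread:

```python
def reconstruct_checkerboard(img):
    """Fill in the 'removed' squares of a checkerboard-sampled
    grayscale image by averaging surviving neighbours.
    `img` is a list of rows; removed pixels are None."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if img[y][x] is None:
                nbrs = [img[y + dy][x + dx]
                        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
                        if 0 <= y + dy < h and 0 <= x + dx < w
                        and img[y + dy][x + dx] is not None]
                out[y][x] = sum(nbrs) / len(nbrs)
    return out

# A smooth gradient with every other square removed:
# interior removed squares come back exactly, edges approximately.
grid = [[(x + y) if (x + y) % 2 == 0 else None for x in range(4)]
        for y in range(4)]
print(reconstruct_checkerboard(grid))
```

The point the analogy makes survives in code: once the sampling is fine enough relative to the structure, "half the pieces" carries nearly all the recoverable information, which is exactly why the million-by-million grid case feels so different from cutting the picture in half.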

So I'll not concede defeat, but I will admit you have me stumped for the moment. I'll continue pondering this and get back to you.

By the way, since we've been addressing the topic of "belief", I suppose I should point out that, being at heart an agnostic, I admit that my belief that I am not duplicable can never be shaken. I only point out that it is a core belief which is extremely obstinate, even if not unmovable. As I assume with you Brian, your core assumption that we are duplicable could someday be moved as well.

Thought experiments like this one are good. I "value" this debate!

#133

  • Lurker
  • 0

Posted 22 November 2004 - 03:57 PM

jaydfox said:

So I'll not concede defeat, but I will admit you have me stumped for the moment. I'll continue pondering this and get back to you.


I was left in the same position when I previously held yours and elrond's view. You may reconcile that issue as you see fit, but I tend to agree with bgwowk.

By the way, since we've been addressing the topic of "belief", I suppose I should point out that, being at heart an agnostic, I admit that my belief that I am not duplicable can never be shaken. I only point out that it is a core belief which is extremely obstinate, even if not unmovable. As I assume with you Brian, your core assumption that we are duplicable could someday be moved as well.


So, to be clear, your view is unshakable; without sounding condescending, it is literally an absolute belief.

-----------------------------

It was not an easy decision for me to side with bgwowk; his position is seemingly existentially disconcerting. In theory I agree with him; in practice I don't know if I could take part in the duplication/self-termination exercise (particularly self-termination). While I have a relatively high degree of certainty in such a position, I take notice of my doubt and/or uncertainty when something may threaten my existence. I don't require certainty in most decisions, this one included, but I will err on the side of caution before making such decisions.

edit: 666 posts [mellow]

#134 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 22 November 2004 - 04:30 PM

edit: 666 posts [mellow]

It's the human side of these debates that keeps me here!

It was not an easy decision for me to side with bgwowk, his position is seemingly existentially disconcerting. In theory I agree with him, in practice I don't know if I could take part in the duplication/self-termination exercise (particularly self-termination).

By the way, I find it interesting that Nate and I had this discussion in the past about values and the decision to self-terminate. At the time, I thought it was obvious that self-termination is bad, and the tenor of the discussion seemed to imply this was universally a given. But given the apathy with which some seem to view self-termination--in light of the duplication problem--I suppose that even that value cannot be assumed to be a good one! Are we therefore led to a condition where no value can be assumed to be of "value"? And hence no belief can be of value?

#135

  • Lurker
  • 0

Posted 22 November 2004 - 05:01 PM

jaydfox, believe it or not I opened a link to that thread yesterday.

The issue you bring up is whether we can assign universal values. What I've found is that assigning such values as universal is more difficult than one would think. Particularly if we are trying to describe all beings within those values, or even just humans.

#136 bgwowk

  • Guest
  • 1,715 posts
  • 125

Posted 22 November 2004 - 05:21 PM

jaydfox wrote:

Neither Elrond nor I have a problem with the slow swapping of atoms that occurs over our lives: at worst, it represents a very gradual "death" of our old selves, so smoothly merged with the new as to be for all intents and purposes completely irrelevant. At any given moment we retain well over 90% of our original structure, even assuming 10% is "actively" being reconfigured and swapped out for new proteins, mRNA, and basic chemicals such as water, glucose, lipids, vitamins, minerals, etc.

When you say "at (within) any given moment we retain well over 90% of our original structure", that depends on how you slice your moments. If we define a moment as one year, that's certainly not true. If we define a moment as a picosecond (the timescale of individual chemical reactions), then you retain 99.9...% of your original structure at any given moment even if I swap out all your atoms in one day.

Perhaps moments should be defined in terms of our conscious awareness, such as whether the duration of the replacement is long compared to the timescale of our thoughts. But no, that doesn't work. Someone can be in a conventional coma for several months, experiencing no thought at all, and when they wake almost all their atoms have been replaced. (This is reality, not thought experiment.) And we still consider them the same person. Presumably if they had been frozen instead of in a conventional coma, and robot nanomachines instead of conventional biochemistry had replaced their atoms over several months (years? weeks? hours?) we would also consider them the same person.

So this distinction between "slow" and "fast" swapping seems quite arbitrary. Why would atomic replacement over a 10^19 picosecond timescale (the natural one) be philosophically okay, but over 10^17 picoseconds not? I can see no reason.
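For scale, the two timescales contrasted above convert roughly as follows; this is just a back-of-the-envelope unit check, nothing more:

```python
PS_PER_S = 1e12     # picoseconds per second
S_PER_DAY = 86400.0 # seconds per day

def ps_to_days(ps):
    """Convert a picosecond count to days."""
    return ps / PS_PER_S / S_PER_DAY

# The two timescales contrasted in the post:
print(ps_to_days(1e19))  # ~115.7 days: the natural, months-scale turnover
print(ps_to_days(1e17))  # ~1.16 days: the hypothetical one-day swap
```

So the two figures differ only by a factor of 100 in an already enormous number of elementary chemical-reaction times, which is the force of the "why would one be okay and not the other" question.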

Somewhere in the universe, at some point in time, perhaps even at this instant, there might be another person out there who is virtually (if not completely) identical with me to the atomic/electric/kinetic level, and I may never even know about it. Is that person me? I certainly contend he is not.

The only natural way such a person could come to exist is if his surrounding environment was also virtually identical to yours (parallel universe), given that the state of your brain is a product of its past interaction with the environment. So if you did suddenly subjectively wake "elsewhere", it would be hard to detect. If you were observing scientific instruments measuring some fine physical process at the same time this happened, you might witness an inexplicable random change in some parameter. But guess what? That's exactly what we do see. Our subjective awareness drifting among multiple versions of ourselves is the origin of randomness in MWI quantum mechanics.

---BrianW

#137 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 22 November 2004 - 06:48 PM

When you say "at (within) any given moment we retain well over 90% of our original structure", that depends on how you slice your moments. If we define a moment as one year, that's certainly not true. If we define a moment as a picosecond (the timescale of individual chemical reactions), then you retain 99.9...% of your original structure at any given moment even if I swap out all your atoms in one day.

No, not 99.9...%. I'm taking into account that the swapping out of hydrogen and oxygen and other elements, and indeed the swapping out of proteins, mRNA, etc., is a process, not an event. If we assume that those molecules in the process of swapping out are temporarily "offline", that's probably a sizeable percentage of our brain material out of commission. However, 90%-99% of our matter (though presumably 99.9...% of our consciousness) remains the same. And if we assume that those molecules that are swapping out are also part of the consciousness, then closer to 99.9...% of our matter is the same at any given moment. I was drawing a distinction between the matter and the consciousness, and also making a worst-case assumption about how much the swapping-out process really affects our brain material.

Someone can be in a conventional coma for several months, experiencing no thought at all, and when they wake almost all their atoms have been replaced. (This is reality, not thought experiment.) And we still consider them the same person. Presumably if they had been frozen instead of in a conventional coma, and robot nanomachines instead of conventional biochemistry had replaced their atoms over several months (years? weeks? hours?) we would also consider them the same person.

This is again going back to our definition of biostasis. As far as I'm concerned, it doesn't really change things much. Of course, as far as you are concerned, it doesn't change things much.

When frozen, we are not terribly active biochemically, but quite active otherwise. It's not as though time stopped for the underlying matter: electrons still zip along in their course, nuclei do their little dance, and quantum states are in quite a frenetic race of fluctuation. It is within this active environment that I allow the possibility of atomic replacement to work: it wouldn't be much different than active replacement. And there's no paradox: the replacement is done in situ, with temporal continuity. We can't really entertain the thought experiment of simultaneously and instantaneously replacing every atom at once, because it's not really even theoretically possible, except perhaps in fringe QM (where Douglas Adams' party favorites are possible, such as a pair of panties instantly disappearing and rematerializing a few feet away, or people traveling instantly to the other side of the galaxy).

So if we can't stop time for the underlying matter, the next best thing is to effectively stop time for all intents and purposes: reducing the temperature to effectively zero Kelvin. I say effectively, because it doesn't ever really reach zero. What I mean is to reduce the human brain to a Bose-Einstein Condensate, where the entropy is so low as to render the entire brain effectively in one quantum state. It is at this point that you could try to replace every atom, and it wouldn't really matter if it were a gradual process or instantaneous event. You could probably even store the brain and reconstruct it later and it would be pretty much exactly the same. However, what happens to "me" when you turn my brain into a Bose-Einstein Condensate? I'm still not convinced that cryonics will bring "me" back, so you can imagine how I feel about being turned into a BEC.

However, turning the brain into a Bose-Einstein Condensate is not something I'd want to try on my brain. But you're free to.

#138 bgwowk

  • Guest
  • 1,715 posts
  • 125

Posted 22 November 2004 - 11:56 PM

Jaydfox,

Is a person in a conventional medical coma for several months, during which time most of their atoms are naturally swapped out, the same person when they wake? If they are the same person, then why wouldn't they be the same person if nanomachines augmented the swapping to happen over one day instead of months? Why is the timescale important, and what is it that distinguishes an identity-preserving swap timescale from an identity-destroying swap timescale?

If you do believe that a person in a conventional medical coma for several months "survives" as the original person despite waking in an atomic duplicate body, what would happen if instead of discarding the replaced atoms we rebuilt the original physical body in a room beside the natural replaced-atom body and woke it instead? Who would be the "real" person? The body in which the atoms had been naturally replaced, or the body with all the original atoms?

Consider how individuals and societies have historically answered questions like this. It is a natural human tendency to regard any extreme change with caution and fear of death. For example, I fully believe that had you described, 80 years ago, a thought experiment in which the atoms inside an unconscious person were replaced, most people would have believed it would kill the original. Today science knows that this happens naturally, and people learn enough chemistry and biology in school to believe it, so they simply accept "slow" atomic replacement as non-fatal. But why should it be non-fatal? I assert the only reason it isn't regarded as fatal is that people have become used to the idea.

Here's another example. When anesthesia was first introduced, there was a serious and real fear that inducing such unprecedentedly deep states of unconsciousness might be "fatal," even though patients appeared to wake after the procedure. Nowadays everybody either knows somebody who's had anesthesia or has had it themselves, so it's not regarded as a philosophically troubling issue.

Today most people still believe that loss of brain electrical activity is fatal. The identification of life with electricity is rooted deep in 19th-century vitalist doctrine that persists to this day. Many people define human life in their mind as a continuous process of brain activity with the same certainty that some people in this thread define personhood in terms of physical objects. They believe that anyone who ever comes back after stopped brain activity will be a soulless zombie.

And here we have a perfect living example of how firm beliefs about personal identity can be changed. Take most people who believe that life ends when brain activity stops, and tell them that medicine ALREADY routinely recovers people with stopped brain activity, even after days of no brain activity. They will change their opinion rather than marginalize themselves by adhering to a "weird" opinion about brain electrical activity and survival that will be at odds with empirical medicine. A thought experiment is one thing. Confrontation with reality is another.

We see several examples of this in this thread. Why do some people believe that a person could survive their brain being broken into a few pieces, but not billions? Answer: Repair of the former is conceivable with present surgical technology, the latter not. Why do some people believe that atomic replacement over months is survivable, but replacement over faster timescales isn't? Answer: The former is already known to happen, the latter not. There is no other objective justification of these "lines in the sand" I can think of.

Thus I predict that the firmly entrenched positions in this thread will not change as a result of any new insights into consciousness or cognitive neurobiology. They will only change (should thread participants live that long) when centuries from now a substantial number of people actually go through the kind of thought experiments we are discussing. Even though the outcome of those experiments can already be predicted from physical law, and people can assert what their response would be if such experiments were performed, the REALITY of people waking from stopped brain states (already with us), extensive disassembly and reassembly, atomic replacement and atomic copying will be something else. Seeing is believing, and these things will be routinely accepted to constitute survival of individuals just as anesthesia is today. Even by elrond. :) Count on it.

---BrianW

#139 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 23 November 2004 - 05:30 PM

I see we remain at an impasse. Both sides believe something ideologically anathema to the other. I believe that a flame is unique. Light another flame on another candle, and it is not the same flame. It is a flame, and neither of us is contesting whether it is a flame. But you say it is the same flame. To you, the fact that it is A flame means it is THE flame.

In the case of a flame, we could chalk it up to semantics, a curiosity of language perhaps. But I'm not a flame, I'm "me". Semantic curiosities are not something to toy with when we're dealing with "me".

Thought experiments aside, reality changing aside, you can't convince me that I'm not me. I fully concede a complete duplicate (materially) of me can be made, but without some tie to the original other than its identical structure, I just don't see how you can think that I'm now that person. If a perfect duplicate of me were to walk in on me right now, I'm sure I would be weirded out. But it wouldn't do anything to change my mind. Why?

Because I have as much connection to him as I do to you. I am not you, and I don't think you would contend that I'm you. Why in my whole life have I never experienced your life? From picosecond to picosecond, I continue to live my life.

Who knows, perhaps from femtosecond to femtosecond I live a different person's life, never remembering the life just before, and after a few billion femtoseconds I'm back in Jay Fox's body, living his life. So maybe I am everybody, and I just don't remember it. This makes about as much sense to me as the idea that I am both "me" and a copy of "me".

But no, I don't think I observe your life. I observe mine. And if a duplicate were created next to me, I would continue to observe my life, and he his. And if you destroyed me and rebuilt me a thousand years from now, or a nanosecond from now, I would be gone, and a new person would continue to live my life. And I have no doubt that you could do it in a way that that new person is unaware of it, and that that person, unaware of what happened, and being filled with artificial memories, would claim to be me. I would know different. Of course, I would be gone, so I guess I wouldn't know any different.

You see, we have a fundamental difference of opinion. You don't believe you have an internal observer. I'm not quite sure how you believe this, and you can deny it, but it's obvious from your reasoning that you believe it. Either that, or you care more about your memories than your internal observer. Me, I care about my internal observer. Memories are nice, but I've already forgotten most of them anyway, so I'm not particularly attached to them, as long as they change slowly enough for me to get used to the new ones and the loss of the old ones. So far it's been working out quite nicely.

You? You would gladly give me all your memories and then cease to be. I don't understand it, but since that's your decision, I at least respect it.


One final note: Part of my fear of oblivion has to do with anticipation. Once destroyed, I will not care, and the new person in my place will be grateful for his life. If I were destroyed unaware, and replaced, it wouldn't matter to anyone involved. Not "me", for I am gone. Not the new guy, for he thinks he's me and he'll happily go about his life believing so. Not you, because, well, nothing happened to you.

Kill a man, and he won't care. Tell a man you are going to kill him, and he very likely will care, and quite a lot as well. Tell me that you're going to destroy and recreate me, and I will care profusely. Once destroyed, I won't care, just as the dead man would no longer care. The new person in my place, still on his adrenaline rush from thinking he was going to be destroyed, will look around in wonder that he is still alive, and proclaim that he is still me and that the experiment was a complete success, that he is still "me". As I said, no external observer in this experiment can determine the fact of the matter: "I" am gone.

#140 bgwowk

  • Guest
  • 1,715 posts
  • 125

Posted 23 November 2004 - 06:00 PM

jaydfox wrote:

You don't believe you have an internal observer. I'm not quite sure how you believe this, and you can deny it, but it's obvious from your reasoning that you believe it.

Absolutely not. Of course I have subjective awareness. We both agree we have subjective awareness. We both want to protect our subjective awareness. The argument we are having is what that awareness depends on.

You? You would gladly give me all your memories and then cease to be. I don't understand it, but since that's your decision, I at least respect it.

I absolutely do not approve of this.

Now please answer my question. If a person is kept in a coma long enough for most or all of their atoms to be naturally replaced, will they awake as the original person or not? If someone did awake from a year-long coma (or decades-long coma as in the movie "Awakenings") in a hospital today, would you actually tell them you were sorry to report that their predecessor had died, and they were a new person with the predecessor's memories?

Perhaps you regard your beliefs about death by matter replacement as a purely personal matter. That being the case, perhaps you want laws to be changed so that were you to fall into a coma of more than a few weeks duration your last will and testament could be probated, your organs donated, and your family move on. After all, should you awake a year later, it wouldn't REALLY be you, right?

I'm not being sarcastic. These are sincere questions. Please don't avoid them.

---BrianW

#141 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 23 November 2004 - 07:07 PM

You? You would gladly give me all your memories and then cease to be. I don't understand it, but since that's your decision, I at least respect it.



I absolutely do not approve of this.

Okay, now I'm confused. If my brain were to be rewired so that I had your memories, your "brain", then I would be you. You could then destroy your original and continue to live on through me. This is not only in the spirit, but in the wording, of what you yourself have said. You have said that if an alternate set of atoms were reconstructed into you, it would be you. What if that alternate set of atoms is me? As I said, you would gladly give me your memories and cease to be.

Of course, I wouldn't want to be rewired entirely into someone else, so I suppose this thought experiment breaks down on those grounds. But find a suitable donor, a sentient person with his or her own "me"-ness, and rewire him or her into you.

If a person is kept in a coma long enough for most or all of their atoms to be naturally replaced, will they awake as the original person or not? If someone did awake from a year-long coma (or decades-long coma as in the movie "Awakenings") in a hospital today, would you actually tell them you were sorry to report that their predecessor had died, and they were a new person with the predecessor's memories?

If I thought they were a new person, and that the previous person had died, I wouldn't find it necessary to tell them.

In this situation I am too ignorant of the processes involved to commit one way or the other. For the purposes of this debate, I will agree to stipulate that a long-term coma, even on the order of decades, does not affect the preservation of the original.

Even in this case, atoms swapping in and out is a red herring: it is a process. I am not identified by the atoms within me. We can play the mind game of instantaneously swapping an atom for another atom of the same element, but that isn't what happens. Atoms swap out through processes, usually chemical processes in which one or more molecules and/or atoms undergo an exchange of electrons or electron states, leaving a different set of molecules and/or atoms in the reaction's wake. These very processes are an integral part of our consciousness, so to talk of constituent atoms is somewhat pointless.

Perhaps you regard your beliefs about death by matter replacement as a purely personal matter. That being the case, perhaps you want laws to be changed so that were you to fall into a coma of more than a few weeks duration your last will and testament could be probated, your organs donated, and your family move on. After all, should you awake a year later, it wouldn't REALLY be you, right?

A lot of people have a lot of different ideas about things, as purely personal matters. I don't expect the law to change for anybody just because of a personal conviction. It might be a personal matter to me that if I don't kill a 13-year-old virgin at least once a week, my soul will be damned.

However, I do believe the law should allow people more personal freedom to make decisions that don't affect everyone else. I think euthanasia should be legal, and suicide for that matter. Killing yourself in a way that could harm others should obviously be off limits, just as using a hunting rifle or a steak knife in a way that harms others is against the law. Of course, how you punish someone under the law for jumping in front of a bus and causing an accident that injures or kills others is another matter altogether.

I think people should be able to decide what level of mental or physical treatment or augmentation they are willing to receive. I think that people should be allowed to define when they will consider themselves dead, as long as it is not patently against common perception. For example, arbitrarily saying, "Okay, I'm dead now.", while you continue to stand and breathe and talk and react right in front of me, is not acceptable. Figuring this out becomes quite troublesome if the implications of declaring one's death affect legal matters, so some minimum guidelines are obviously in order, but there should be some room for interpretation. You should be free to upload yourself into a powerful binary computer and say that it's you, and have that program have legal rights as a citizen, etc., as long as it can pass a modified thorough Turing test as being you. And I should be free to say that, if such a copy of me were uploaded, perhaps forcibly, that that copy would not be me.

#142 bgwowk

  • Guest
  • 1,715 posts
  • 125

Posted 23 November 2004 - 08:41 PM

jaydfox wrote:

Okay, now I'm confused. If my brain were to be rewired so that I had your memories, your "brain", then I would be you.

Let's be clear about this. If I rewire your brain to include at-will image recall of my life events, like an in-situ electronic photo album, that would be you with my memories. But if, while we are both unconscious, your ENTIRE BRAIN is restructured at the atomic level to be substantially identical to my brain, that would NOT be you with my life memories. That would be me. Period. You would be dead.

If I feed your brain to a growing animal as its sole food source so that your brain becomes the animal's brain, you don't become the animal with animal memories. You die. You die when your brain structure, and all records of it, are destroyed.

Atoms swap out in processes, usually in chemical processes where one or more molecules and/or atoms undergo an exchange of electrons or electron states, and afterwards a different set of one or more molecules and/or atoms is left in the reaction's wake.

Correct. We are not talking about atom replacement in an abstract sense. In all these thought experiments we are talking about actual chemical processes, by either natural or artificial "nanomachines", that restructure or replace atoms and molecules.

These very processes are a very integral part of our consciousness, so to talk of constituent atoms is somewhat pointless.

Incorrect. You can be completely unconscious in a coma, and natural maintenance and overhaul will still eventually replace your atoms and molecules. Even if your brain's electrical activity is completely shut down with drugs (removing all doubt about consciousness), replacement will still happen.

For the point of this debate, I will agree to stipulate that a long-term coma, even on the order of decades, does not affect the preservation of the original.

Then you are stipulating that preservation of the original person does not depend on the particular matter they are composed of. You are stipulating that matter replacement is (at least under some circumstances) acceptable. Agreed?

---BrianW

#143 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 24 November 2004 - 03:16 PM

You are stipulating that matter replacement is (at least under some circumstances) acceptable. Agreed?

Agreed.

But if, while we are both unconscious, your ENTIRE BRAIN is restructured at the atomic level to be substatially identical to my brain, that would NOT be you with my life memories. That would be me. Period. You would be dead.

Forget about while we are both unconscious. If I were conscious, and a sufficiently powerful AI, in concert with billions or trillions of nanobots, were to rewire my brain into your brain while it was in its conscious state, what would happen? Assuming that we remove quantum states from consideration and consider only the atomic, electric, and kinetic structure of the brain (i.e., that which can be stored in a classical computer), a very powerful AI could seamlessly manipulate my brain over a period of days or weeks so that my brain became an exact copy of your brain. I assume, of course, that my body would be restructured either at the same time, or even well before the transition, to be a duplicate of your body. This is only relevant in maintaining a "continuity" (or rather, the perception of continuity, since physical continuity is obviously broken here) of sensory inputs (including your sense of your body).

Part of this would of course require saving the state of your brain in very fine timeslices during at least a few nanoseconds, and possibly even a few seconds, as opposed to an instantaneous snapshot. This is to get a flow that my brain can be worked into. Sort of like handing off the baton in a relay. Or taking that exit from Jay Highway and merging onto Brian Highway. But in a purely deterministic model of the brain it could be done, in situ, while I am conscious, and I would end up being you, materially.

The transition would not be seamless unless my sensory inputs were supplied with the same dynamic states that yours were in when your "snapshot" was taken, but this would be trivial, either through artificial stimulation (a la the Matrix) or by immersing me in the same sensory environment. If your "snapshot" were taken while you were in a sensory deprivation tank, this would be even easier, but since we have a sufficiently powerful AI and nanotech, I don't see why the snapshot couldn't be taken while you were in the middle of a vigorous game of soccer or full-contact chess. With automatons created and the kinetic world set up around me as the transition completes, the new me would materially be the same as the old you, and in the same environment, at least as far as the senses are concerned.


So, what say you? If I gather your position correctly, you will say that this new person is "you". Not just a copy of you, but you as much as the original. Of course, "you" have had your snapshot taken, and are off living your life, perhaps even unaware that the snapshot was taken.

What do I think? Well, I certainly don't think it's "you". At least, not if you are self-aware in a manner qualitatively similar to the way I am. Of course, from my point of view, the two new people will be "identical", at least in form and in their interactions with the world. That's the thing about the internal observer. You could just be an automaton. Your duplicate could be an automaton. My duplicate could be an automaton. Subjectively, all that matters is the internal observer. Like I've said, there's no way for an external observer to know any different. Only our internal observers are aware. Mine is, anyway. How about yours?

What I'm less sure about is whether this new person is me. That I have doubt either way only shows me that I'm missing a piece of the puzzle; there are unanswered questions. I was deterministically forced into being you, and I think part of that process may kill me by degrees, since I am still leaning towards thinking that "I" am not deterministic. On the other hand, the continuity is there, so at worst the interference is no worse than taking psychedelic drugs, something that alters my thoughts so smoothly and imperceptibly that I feel much like I do when I wake up in the morning. Perhaps you know that feeling, when you know that you need to get up, and you came up with a whole list of reasons the night before why you need to get up in the morning, but when the alarm goes off, you just hit the snooze button, even as you're reminding yourself of how important it is to get up.


So perhaps it is still me, with no memory of my former life. Much the same way that I have very little memory of my life at age five, and pretty much no memory of my life at age three. Much the way that I suspect that I will have very little connection to the person I am now, five hundred years from now, even assuming I stay "in the flesh".

So I could go either way, and it's not worth getting into. What I am certain about is that the new person will not be "you", not in the same sense that you think it will be "you".

#144 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 24 November 2004 - 03:19 PM

There are other things I need to do today, including throwing in my two cents in other fora here. And I will be off the Internet until Monday, given the holiday.

So, in order to try to broaden my horizons, and seeing that no harm may come and I might get a deeper understanding of both sides of the issue, I will make an attempt to suspend my objections and view things from your side of the aisle. I don't know how much thinking time I'll get in over the weekend, but let's see where it gets me by Monday.

#145 bgwowk

  • Guest
  • 1,715 posts
  • 125

Posted 24 November 2004 - 08:53 PM

jaydfox,

The person you describe at the end of your thought experiment would be me. Specifically, it would be me continued from the moment the "snapshot" of my original body was taken. My subjective awareness would jump from the snapshot moment to the moment what used to be your brain became identical to my brain at the snapshot moment. Think about it. Because of the way you've defined the physical circumstances of this experiment, that *objectively* is what this new person would report. You would be dead with the same certainty that you would be dead if an animal ate your brain and restructured the proteins into its brain.

I think that in these thought experiments you generally give too little weight to what the products of these experiments report. Ask yourself this: Why is it that society believes that people who wake from comas, even ones with no brain activity, are still the original people? Why do you yourself acknowledge that people can survive decades-long comas, with essentially complete matter replacement, and still wake as themselves? Answer: Because you feel the force of social convention that people who wake insisting they are still themselves ARE themselves!

Whenever any outrageous medical scenario is proposed that alters the brain in a way no brain has ever been altered before (anesthesia, complete matter replacement, electroconvulsive therapy, cortical electrical silence, suspended animation, cryonics, ex-situ repair of cryonics patients...), there is always a contingent of people that believes-- ON PAPER --any revived person will not be the original person. Then when these things are actually done-- IN PRACTICE --doubt evaporates, and people who go through these processes are still accepted as themselves. Without exception. I don't know why you think future brain repair and/or mind transport involving extensive disassembly and matter replacement will be different.

The pattern doctrine of subjective personal identity simply states that there is a one-to-one correspondence between physical process and subjective experience. Your personal subjective feeling of continued personal existence at any moment is purely a product of a material process at that moment, and if that same material process recurs at any future time or place (be it one second later at the same spot, or a century later on another world), your subjective feeling of existence will pick up exactly where it left off. Whatever process happened in between (be it disassembly, reassembly, or matter replacement) doesn't matter. And why should it? By what mechanism could it?

Consider:

1) If I surgically disassemble your brain into a dozen pieces, and perfectly reassemble them, you survive.

2) If I surgically disassemble your brain into a million pieces, and perfectly reassemble them, you survive.

3) If I surgically (with nanomachine surgeons) disassemble your brain into atoms, and perfectly reassemble them, you survive.

4) If I surgically disassemble your brain into atoms, and replace any or all atoms with identical atoms, you survive.

Where does this break down, and WHY? More to the point, how could you ever PROVE (beyond trivial semantics) that the original person did not survive in the same sense you have stipulated that a decades-long coma patient survives?

The inability to prove that the above scenario causes death leads me to the unavoidable conclusion that subjective personal identity is based on pattern and process, not particular matter.

Enjoy your weekend.

----BrianW

#146 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 29 November 2004 - 08:29 PM

Think about it. Because of the way you've defined the physical circumstances of this experiment, that *objectively* would be what this new person reports.

Objectivity is beside the point! How can one objectively define that which is precisely subjective? Call it metaphysics if it makes you feel better, but you cannot objectively define a subjective experience until the day that you can have two subjective experiences with two separate consciousnesses at the same time. Linking our consciousnesses creates one "super"-consciousness experiencing both sets of subjective experiences, and doesn't count. Experiencing one and then the other doesn't count. If you want to discount consciousness altogether, due to its inherent subjectivity, then go ahead.

I think generally in these thought experiments you give too little weight to what the products of these experiment report.


Not at all. If you were replaced with a doppelganger of yourself, I would gladly accept that person's assertion that he was "you", especially if I were unaware of the switch. Even if I were aware, there is nothing (that we know of yet) that either of us could do to convince me that the new you isn't "you", other than to invoke an objection based on my knowledge that you had been swapped out.

But it changes when we turn the tables on me.

Think of us as black boxes. You put inputs in, get outputs back out. These are measurable inputs and outputs. We don't really know about the insides of the black box, but we can define its external behaviors well enough to know what's going on. We could swap the black box for a new one, and if it behaves the same, we assume it's the same. But is it?

Let's say our black box seems to reliably return square roots, to four significant figures. We put in 16, it returns 4.000. We put in 19, and it puts out 4.359.

Well, perhaps the first black box just has tables of square roots for all values from 1.000 to 99.99, with factors of 100 pulled out and converted to 10's. Perhaps another black box uses an algorithmic approach, similar to the one taught in elementary school or junior high. Perhaps a third one has a little pool inside that it fills with 19.00 cubic units of water, then adjusts the length and width in unison until the depth is 1.000 units, and then returns the width or length.

Same behavior, different internals. By your assertion, it's the exact same black box, but clearly something has changed.
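The square-root boxes above can be sketched concretely. This is my own illustration (the function names and the two methods are hypothetical stand-ins for the "algorithmic" and "pool of water" boxes, not anything from the thread): two implementations that agree on every input while working completely differently inside.

```python
def sqrt_digit_by_digit(n: float) -> float:
    """Schoolbook-style box: build up the root one decimal digit at a time."""
    root, step = 0.0, 10.0
    for _ in range(6):  # enough digits for four significant figures
        step /= 10.0
        # Push the current digit as high as it can go without overshooting.
        while (root + step) ** 2 <= n:
            root += step
    return round(root, 3)

def sqrt_pool(n: float) -> float:
    """'Pool of water' box: bisect on the side length of a square basin
    until n cubic units of water stand exactly 1 unit deep."""
    lo, hi = 0.0, max(1.0, n)
    while hi - lo > 1e-7:
        mid = (lo + hi) / 2.0
        if mid * mid < n:
            lo = mid      # basin too small, water too deep
        else:
            hi = mid      # basin too big, water too shallow
    return round((lo + hi) / 2.0, 3)

# Externally indistinguishable, internally different:
print(sqrt_digit_by_digit(19.0), sqrt_pool(19.0))  # 4.359 4.359
```

From the outside, no sequence of inputs and outputs distinguishes the two boxes; only opening them reveals that "something has changed."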

Okay, so where's our black box? Well, for millenia, it was the body. Then for mere centuries it was the brain. The black box of the body yielded up its secrets, but we found another black box within.

In the last couple of centuries, with more emphasis on the last few decades, the black box that was the brain has been pried open. But inside lies another black box. Can we ever drill down far enough to find the last black box and pry open its secrets?

Perhaps. In fact, very probably. Materialists would assert that we definitely will. Me, I'm in the "probably" category. You strike me as the type that assumes we've already found the last black box and are busy cataloging its innards.

But in my view, no classical model can ever hope to be correct. A computer hooked up to a camera can see the world with better resolution than my eyes can, yet neither of us would contend that it has any sort of "subjective experience" of vision. Well, we could get into a debate about all the extra hardware we have for shape and edge detection, links to memory, and powerful parallel complex processing units... It's kind of pointless, as far as I'm concerned. It is always reducible to a classical deterministic Turing machine. (Of course, if memory serves, a non-deterministic Turing machine has no more real power; it just runs faster.) A classical computer will never subjectively experience the world the way a human can. More than sheer complexity is required. Complexity is just our way of trying to obscure the results: they are still deterministic.

Sure, we could say that indeterminism can be simulated via pseudo-random number generators, "classical" superposition, or even truly random number generators. In the first case, I think we can agree that this is still deterministic: it is not a quantum state, even if its results in the program are the same. In the second case, we see both determinism and a problem with memory: what computer could ever hope to have the processing power to handle 2^(10^18) superpositions per time cycle? The last case is somewhat ridiculous, as it would essentially undermine the whole basis for assuming the system could be classically modelled in the first place.


Quantum mechanics becomes tricky as well, because it is so poorly understood in the realm of particle physics, let alone on the combined nano-micro-macro scale of neurology. I am firm in my "belief", if you would rather call it that, that I am not duplicable. Or rather that "I" am not duplicable, even if my physical brain were. Again, within the realm of the classical physical sciences, there is no objective difference between "my" brain and an exact copy; that does not mean there is not a subjective difference. This does not require me to invoke the immaterial: it is only immaterial within the realm of classical physics.

However, in the spirit of quantum mechanics, I have not completely ruled out that I am copyable. I could be copied, if the process itself required that my original be destroyed: copyable, but not duplicable. Quantum teleportation, as it is called, might be an agreeable mode of transport and/or short- or long-term storage. I'm awaiting better theories and insight on that one: here perhaps is a situation where, as you said:

Whenever any outrageous medical scenario is proposed that alters the brain in a way no brain has ever been altered before..., there is always a contingent of people that believes-- ON PAPER --any revived person will not be the original person. Then when these things are actually done-- IN PRACTICE --doubt evaporates, and people who go through these processes are still accepted as themselves. Without exception.


In the case of quantum duplicates, I think you are right. In the case of classical duplicates, you're probably going to see a lot of people, including scientists, who consider classical duplicates as merely a refinement of the process of creating wax models of people: damnably realistic, but not the original. That won't preclude us from granting these new copies full rights under the law, as citizens at least. But how would we treat assets? Should they belong to the original or the copy? The original, clearly, unless there is some other deciding factor (e.g., a "will" or other legal document saying "In the event I am copied..."). What if the original is destroyed and there is a copy? Should the copy have more right to the assets than the destroyed person's heirs, e.g. next-of-kin? What if there were two copies? Would we treat this as a death and redistribution of assets? I think a single copy, being a separate entity, would be treated much as an identical twin and a spouse at the same time: separate, but very closely connected. Multiple copies would probably be treated like multiple siblings: division of assets. Unless there's a way to determine which copy came first or otherwise has a stronger legal claim... Murky times, but probably far enough away as to be beyond our ability to really extrapolate the implications: it will probably depend on what order certain technologies become available, such as reanimation from cryonic suspension, etc., and how those technologies are affected by the prevailing scientific, philosophical, and religious attitudes of the time.

Legal considerations aside, undoubtedly we would "morally" give them equal weight as our fellow sentient beings. While the religious nuts will probably decry the fact that they don't have a "soul", I will not go so far. If the brain is duplicated and it functions, as far as I'm concerned it has a "soul", whatever that is for the sake of this discussion: that internal observer which is not duplicable, material or immaterial. But it will not be the same soul, even if at the moment of its creation it occupies the "same" physical body and brain structure.

In the case of a quantum duplicate, there could be only one, so it avoids many of the legal problems. It's also on much sounder physical ground to say that it is the original (I say "it" as though he or she were a thing, but I mean "it" as in "the copy", which is a neuter noun; no offense to him or her intended). I suspect that quantum duplicates will be decried by the masses, and feared as described by you in the previous scenarios (comas, anesthetics, etc.), but they will eventually become accepted. However, I strongly disagree that classical copies will be accepted as you have described. As people, yes. As the originals, no. Only as much the original as one's twin or child.
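As a side note for the technically inclined: the reason there can only be one quantum duplicate is the no-cloning theorem, and quantum teleportation works precisely by consuming the original. Here's a toy simulation I sketched in Python with NumPy (a simplified three-qubit state-vector model of my own construction, purely illustrative, not anything from the actual physics literature beyond the standard teleportation protocol). It moves a qubit's state from sender to receiver; the sender's measurements collapse her copy, so at no instant do two copies exist:

```python
import numpy as np

rng = np.random.default_rng(42)

def apply_1q(state, gate, q):
    """Apply a 2x2 gate to qubit q of a 3-qubit state of shape (2,2,2)."""
    state = np.tensordot(gate, state, axes=([1], [q]))
    return np.moveaxis(state, 0, q)

def cnot(state, control, target):
    """Flip the target qubit wherever the control qubit is 1."""
    s = state.copy()
    idx = [slice(None)] * 3
    idx[control] = 1
    sub_axis = target - 1 if target > control else target
    s[tuple(idx)] = np.flip(state[tuple(idx)], axis=sub_axis)
    return s

def measure(state, q):
    """Measure qubit q: return a random outcome and the collapsed state."""
    other = tuple(a for a in range(3) if a != q)
    probs = np.sum(np.abs(state) ** 2, axis=other)
    outcome = rng.choice(2, p=probs / probs.sum())
    idx = [slice(None)] * 3
    idx[q] = 1 - outcome
    state = state.copy()
    state[tuple(idx)] = 0          # the unobserved branch vanishes
    return outcome, state / np.linalg.norm(state)

# Qubit 0: the state to teleport (any normalized amplitudes will do).
psi = np.array([0.6, 0.8j])
# Qubits 1 and 2: an entangled Bell pair shared by sender and receiver.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
state = np.kron(psi, bell).reshape(2, 2, 2).astype(complex)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

state = cnot(state, 0, 1)          # sender entangles psi with her Bell half
state = apply_1q(state, H, 0)
m0, state = measure(state, 0)      # sender's measurements destroy her copy
m1, state = measure(state, 1)
if m1:
    state = apply_1q(state, X, 2)  # receiver's corrections, given (m0, m1)
if m0:
    state = apply_1q(state, Z, 2)

bob = state[m0, m1, :]             # receiver's qubit now carries psi exactly
assert np.allclose(bob, psi)       # the state moved; it was never duplicated
```

The final assertion holds for every measurement outcome, because the corrections always restore psi on the receiving end, yet the sender is left holding only two classical bits. Classical bits, by contrast, can be copied trivially, which is exactly why I treat classical duplicates differently.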


In the end, I'm back to my original conclusion:

Consciousness is a deeply fascinating topic, one that a person can spend his lifetime trying to understand with the best theories and technology available, but at the end of that lifetime, all one has is a better appreciation of consciousness, and some strong beliefs or assumptions. It's a question that cannot be answered with the same clarity as the question of what is 2+2.

We both have our beliefs: you, that the brain is a Turing machine with sophisticated software, and as such there is nothing truly subjective in our experience of the world; I, that no Turing machine could ever subjectively experience what we humans do, and that we are therefore more than our classical components.

By the way, in case you're wondering, I did at least try to put myself in the position of viewing the world as though a copy of me were me. I managed to creep myself out for a while there. I imagined that I had been replaced the night before, and hence my memory of having this electronic debate with you, an experience very vivid in my "consciousness", was an illusion, for it was an act committed by another person perhaps days or eons ago, before I was created as a copy and placed into this world. Very unsettling, I must admit.

And yet, I couldn't help but think that, if that were the case, then the "me" from which I was copied was long gone, either dead or diverged into his new life. And then I began to wonder if I couldn't apply this logic to the "me"'s of all my yesterdays, and of my yester-minutes, and indeed the me from a few seconds ago. I guess metaphorically one could say that "me" is a constantly dying person, with someone new popping into existence in his place, just to get a brief glimpse of reality before himself dying and being replaced...

After continuing this process ad nauseam, I decided it was kind of pointless: at the very least, my experiences have always been of this body, even if experienced by different "me"'s. And as my brain continues to change and evolve with time, with my memories and emotions and my very cognitive systems changing, I have always managed to view the world through me. I am "me", regardless of the changes in composition and time, versions of me far more different from each other than from a perfect copy of me; versions of me far more different from each other, in fact, than from other like-minded people in history. This of course set me to wondering if there really was something to "past-life" experiences: perhaps one's consciousness did exceed that arbitrary threshold of similarity where it jumps subjectively from one's "past life" to this one, a blending of the old and the new internal observer.

Actually, it's funny, because the more I think about it, the more I realize that your point of view enables more mystical points of view than mine does. Ironic, but kind of cool. It's got me interested in researching more about reincarnation and other such phenomena. Thanks!

#147 zeitgeist

  • Guest
  • 8 posts
  • 0

Posted 29 November 2004 - 10:07 PM

That's an easy one!

What you call "my body" is simply that, and "other bodies" are simply "other bodies".
Not much rocket science here...

Of course you could go around saying: "That clod over there is 'my body'", but most of us know that ain't so and we'll be polite and humor you; so as not to offend...

#148 bgwowk

  • Guest
  • 1,715 posts
  • 125

Posted 30 November 2004 - 01:28 AM

jaydfox,

Your digressions into Turing machines, quantum mechanics, and allegations that I believe in a classical model of brain operation (I am neutral on the subject) are completely beside the point. The question is not whether a brain is a classical or quantum computer. The question is whether a person *is* the computer, or whether the person is the information held by it.

Let us stipulate the following so that this issue doesn't get rehashed over and over:

---> An atomically-precise copy of human body is a DIFFERENT OBJECT than the original body. <---

We all agree on that!

But simply asserting that because a copy is a different OBJECT, that proves the copy is a different PERSON, is begging the question. You are assuming the conclusion that you claim to prove. Whether people are in fact objects is the very question we disagree over. From where I sit, it seems darn hard to argue that subjective continuity is tied to a unique object given that 50% of the matter that composed my brain (or yours) last week is now in a drainpipe somewhere.

Now please answer the question of my last post:

Consider:

1) If I surgically disassemble your brain into a dozen pieces, and perfectly reassemble them, you subjectively survive.

2) If I surgically disassemble your brain into a million pieces, and perfectly reassemble them, you subjectively survive.

3) If I surgically (nanomachine surgeons) disassemble your brain into atoms, and perfectly reassemble them, you subjectively survive.

4) If I surgically disassemble your brain into atoms, and replace any of the atoms with identical atoms, you subjectively survive.

Where does this break down, and why?????

---BrianW

#149 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 30 November 2004 - 03:47 PM

1) If I surgically disassemble your brain into a dozen pieces, and perfectly reassemble them, you subjectively survive.

2) If I surgically disassemble your brain into a million pieces, and perfectly reassemble them, you subjectively survive.

3) If I surgically (nanomachine surgeons) disassemble your brain into atoms, and perfectly reassemble them, you subjectively survive.

4) If I surgically disassemble your brain into atoms, and replace any of the atoms with identical atoms, you subjectively survive.

Where does this break down, and why?????


4) We've already been through 4.
3) No different from 4, as far as I'm concerned.

Debating 1 and 2 is somewhat pointless, as I'm not a 22nd-century neurologist. But at this point I'd venture that 2 would not preserve me, and 1 would only possibly preserve me if the "pieces" were separated at interlobal connections, rather than within the various lobes (functional units) themselves. While conscious this would seem a straightforward conclusion; while unconscious, or in "biostasis", somewhat less defensible, and I'll freely admit it.

But if "me" is preserved during biostasis, then the lack of electrical and chemical signalling is taken out of the picture anyway, and the inherent remaining structure and metabolic chemistry still applies, preserving continuity in a non-duplicable way. It's not really entirely debatable either: to suppose that there is a difference between "conscious" chemical processes and basic metabolic chemical processes also begs the question if there will be differences between basic metabolic chemical processes and artificial ones induced by your nano-bot duplicators. Thus I can be in a coma and remain me, with my atoms swapping out through metabolic processes, but not so when I am disassembled and reassembled by nanobots.



I think we have fairly well hacked this topic to death... Of course, we have copies, and can always revive it later. :))

I have learned a lot, and even changed some of my secondary beliefs and assumptions about cryonics and consciousness. It was worth the exchange.

By the way, it's not that I don't want to continue this discussion, only that certain thought experiments I've been running through have pushed me to continue researching quantum mechanics and its connection with subjectivity versus objectivity. I'm also still working on my idea of whether determinism, randomness, and free will can be mutually exclusive concepts or "actors", or if we indeed have only determinism and randomness. Probably asked-and-answered questions, but I'm still a neophyte.

#150 bgwowk

  • Guest
  • 1,715 posts
  • 125

Posted 30 November 2004 - 04:14 PM

jaydfox wrote:

But if "me" is preserved during biostasis, then the lack of electrical and chemical signalling is taken out of the picture anyway, and the inherent remaining structure and metabolic chemistry still applies, preserving continuity in a non-duplicable way.

There is no "metabolic chemistry" during biostasis. In vitrification, all atoms and molecules are locked into position in a glassy matrix and can't engage in chemistry at all. Structure is all there is.

It's not really entirely debatable either: to suppose that there is a difference between "conscious" chemical processes and basic metabolic chemical processes also begs the question if there will be differences between basic metabolic chemical processes and artificial ones induced by your nano-bot duplicators. Thus I can be in a coma and remain me, with my atoms swapping out through metabolic processes, but not so when I am disassembled and reassembled by nanobots.

Your position seems to revolve around continuity of "basic metabolic chemical processes" as a condition for continuity of subjective personhood. Is that correct? But *specifically what are those processes*, and what is your evidence that they are necessary for continuity? As I've already said, it is physically impossible for such processes to exist during biostasis. It's also questionable whether such processes can be said to exist in profound hypothermia (which people have already recovered from) given that normal biochemistry is seriously upset under such circumstances. The only real continuity is structure, not chemistry.

While you recognize that electrical continuity is just a popular myth, it seems to me that you are struggling to find a surrogate process that will have the same role in identity preservation that electricity is fallaciously believed to have. But there is no such known process! If you are searching for one, then doesn't that mean that your belief that identity depends on something more than structure is just a hypothesis, although you believe that your hypothesis will ultimately be proven?

---BrianW



