




Body+Brain emulation transition method dilemma


86 replies to this topic

Poll: What would you do? (15 members have cast votes)

You are abducted by a top secret research organization. They offer you four choices. Which would you choose?

  1. Gradual transfer of brain patterns to a new and improved you. You are free to go as your new you, with no money. (8 votes, 53.33%)

  2. Instant transfer of brain patterns to a new and improved you + you get instantly vaporized. You are free to go as your new you, with $1,000,000 in cash. (0 votes, 0.00%)

  3. Instant transfer of brain patterns to a new and improved you + 3-second pause + you get shot in the ventricle with a rifle. You are free to go as your new you, with $1,000,000,000 transferred to a series of secure offshore bank accounts. (6 votes, 40.00%)

  4. Deny that you only have three choices, only to get proven right by getting shot in the head. (1 vote, 6.67%)

#1 exapted

  • Guest
  • 168 posts
  • 0
  • Location:Minneapolis, MN

Posted 30 September 2009 - 09:37 AM


Scenario:
You are abducted by a top secret research organization. They offer you four choices:

1
They will hook you up to a device that will gradually transfer your brain patterns to a new version of you, looking like you but with more advanced capabilities and a full implementation of future strategies for engineered negligible senescence. As the device transfers each pattern, it disables that pattern in your actual brain, yet maintains the relationships between the patterns as if the brain were still in one piece, by creating links between patterns in your old you and patterns in your new you - it even uses some of the energy from the activity forming a pattern in your brain to power the corresponding pattern in the brain of your new you. When all of the patterns are transferred, there is nothing left for the device to do, and you are free to go as your new you (your bio body is carefully cryo-preserved after samples are taken).

2
They will hook you up to a device that will simultaneously scan your brain and your body to particle-level precision, vaporize you, and "boot up" a new version of yourself. You will notice nothing but a slight flash of light. The process will not have any physical effect on the patterns of the brain; it will simply capture them and, quite literally, move them to your new you. Your new you will have more advanced capabilities and a full implementation of future strategies for engineered negligible senescence. You are free to go as your new you (samples from your bio body will be kept along with your scan), with $1,000,000 in cash.

3
They will hook you up to a device that will simultaneously scan your brain and your body to particle-level precision and "boot up" the new you. Then it will pause 3 seconds and fire a rifle into your heart. Your new you will have more advanced capabilities and a full implementation of future strategies for engineered negligible senescence. You are free to go as your new you, with $1,000,000,000 transferred to a series of secure offshore bank accounts.

They fully verify all of their claims to your satisfaction. Your new you will be more advanced in every perceivable way.

Which would you choose, and why?

Oh, and
4
If you fail to choose, they will just shoot you in the head.

And remember that your money could be used in whatever way you see fit.

Edited by exapted, 30 September 2009 - 10:35 AM.


#2 forever freedom

  • Guest
  • 2,362 posts
  • 67

Posted 30 September 2009 - 01:40 PM

What matters is deciding which of these 3 options has the highest chance of allowing my awareness to continue; i suppose that would be option 1. I think this poll isn't a good one, as money for the person's clone is useless if the real person is dead, so money won't even be considered when making a choice among these 3.

#3 eternaltraveler

  • Guest, Guardian
  • 6,471 posts
  • 155
  • Location:Silicon Valley, CA

Posted 30 September 2009 - 02:10 PM

I'll take option 3. A billion dollars is a huge payoff for a 3-second bad dream I won't remember.


#4 exapted

  • Topic Starter
  • Guest
  • 168 posts
  • 0
  • Location:Minneapolis, MN

Posted 30 September 2009 - 09:58 PM

What matters is deciding which of these 3 options has the highest chance of allowing my awareness to continue; i suppose that would be option 1. I think this poll isn't a good one, as money for the person's clone is useless if the real person is dead, so money won't even be considered when making a choice among these 3.

With option 1, you get to keep your bio body and nothing is violently destroyed. So if there were no money involved, why wouldn't everyone simply pick option 1? I think there are very few possible advantages in options 2 and 3 (maybe some people would prefer not to keep their bio body, but they could always have their bodies cremated afterwards).

I would guess that you believe option 2 has some disadvantage to the degree of at least $1,000,000. If I didn't offer the money I wouldn't be so sure. Because I offered the money and you didn't take it, I believe your "boundary" is between options 1 and 2. If someone picks 2, that person's "boundary" is between 2 and 3.

I was considering offering in-situ replacement, with nano-bots replacing neurons over a 3-day period, then doing the equivalent to the rest of the body. But to me that doesn't really get to the core of the issue. What about when you are already fully synthetic, and you want to upgrade yourself to some entirely new computing paradigm? Do you copy the code over, switch the original off, and turn the new one on? If you voted for 1, then I would guess that you might support some kind of gradual process where the internal factors of the mind maintain their relationships. That's really what I'm curious about.
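To make the contrast concrete, here's a toy sketch in Python (everything here - the idea of a mind as a dictionary of named "patterns", the granularity, the linking flag - is invented for illustration, not a claim about how brains or emulations actually work):

    # Toy illustration only: a "mind" as a dict of named patterns, each flagged as
    # running or not. Names and granularity are made up for this post.
    import copy

    def copy_then_switch(old_mind):
        # Options 2/3: scan everything at once, boot the copy, switch the original off.
        new_mind = copy.deepcopy(old_mind)      # instantaneous scan + boot
        for pattern in old_mind.values():
            pattern["active"] = False           # original deactivated wholesale
        return new_mind

    def gradual_transfer(old_mind):
        # Option 1: move one pattern at a time; at every step the untransferred
        # patterns and the already-transferred ones are kept linked as one whole.
        new_mind = {}
        for name, pattern in old_mind.items():
            new_mind[name] = dict(pattern)      # this pattern now runs on the new substrate
            pattern["active"] = False           # ...and is disabled in the old brain
            new_mind[name]["linked_to_old"] = True
        return new_mind

    mind = {name: {"active": True} for name in ("memory", "perception", "planning")}
    new = gradual_transfer(mind)   # afterwards every old pattern is inactive, every new one linked

The only point is that in gradual_transfer there is never a moment when the whole set of patterns exists solely as a detached copy: each pattern is handed over while the rest still hang together as one working whole, which is roughly what option 1's device is supposed to guarantee.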

What matters is deciding which of these 3 options has the highest chance of allowing my awareness to continue; i suppose that would be option 1.

Could you explain why option 1 would give you a higher chance of allowing your awareness to continue than option 2 or 3? To you it may be obvious, but to others it may not be.

#5 exapted

  • Topic Starter
  • Guest
  • 168 posts
  • 0
  • Location:Minneapolis, MN

Posted 06 October 2009 - 08:49 AM

I'll take option 3. A billion dollars is a huge payoff for a 3-second bad dream I won't remember.

Even if you are sure that it doesn't really matter which one you pick, your beliefs will affect the outcome of the New mind's life, because the New mind will get its beliefs from your mind. If you believe that the New mind will have a different identity from your own when you pick option 3, then maybe the New mind will walk away with the billion dollars thinking "I have these memories telling me that the New mind will have a new identity. Hey, I am the New mind! I'm a totally new person!", causing the New mind to distance itself from your old life. People are often motivated by issues of identity.

Are you sure that the New mind will have the same identity as you?

Here is some informal reasoning to consider:

Possible solution:

(1) 'Each instance of a mind is an entirely separate (although possibly identical) mind',
(2) 'A mind seeks to discharge its drives'
(3) 'A mind's short-term drives emphasize actual experience'
(4) 'A mind's long-term drives emphasize identity'
(5) 'A mind de-emphasizes medium-term drives'
(6) 'An emulation of a mind that believes it has a different identity from the original is likely to dissociate itself from the drives of the original'

Analysis of dilemma:
(1) If the original mind believes its emulation will have a different identity, then the emulation will also believe it has a different identity.
(2) The original mind's likelihood of discharging its long-term drives is diminished to the extent that it believes the emulation will have a different identity.
(3) The original mind's likelihood of discharging its long-term drives is probably higher in option 1 (gradual mind transfer), because gradual mind transfer makes it easier to think of the emulation as having the same identity as the original.
(4) The original mind's likelihood of discharging its short-term drives is probably higher in option 1. It avoids violent destruction and allows the mind to believe, to some extent, that its short-term drives will continue to be fulfilled entirely.

Conclusion from this solution:
If the original mind believes that options 2 or 3 produce an emulated mind with a different identity from the original mind, then the original mind should and probably will pick option 1.

Another solution exists where the original mind absolutely believes the emulated mind will have the same identity as the original, as long as both minds do not exist at the same time. In such a case, option 2 should be attractive. If the original mind could be convinced that the emulated mind will have the same identity, and the death-by-rifle is absolutely considered a "bad dream", then option 3 should be irresistible.
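If it helps, the conclusions above can be restated as crude pseudo-logic in Python (the two boolean inputs are, of course, gross simplifications of what are really graded, uncertain beliefs):

    # Crude pseudo-logic version of the conclusions above.
    def choose_option(believes_emulation_is_same_identity: bool,
                      considers_rifle_death_a_bad_dream: bool) -> int:
        if not believes_emulation_is_same_identity:
            return 1  # gradual transfer: easiest to regard as the same continuing mind
        if considers_rifle_death_a_bad_dream:
            return 3  # same identity + the shooting is "just a bad dream" -> take the billion
        return 2      # same identity, but the violent ending still counts against option 3

    assert choose_option(False, False) == 1
    assert choose_option(True, False) == 2
    assert choose_option(True, True) == 3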

Edited by exapted, 06 October 2009 - 08:54 AM.


#6 ben951

  • Guest
  • 111 posts
  • 15
  • Location:France

Posted 06 October 2009 - 02:36 PM

I would be pleased if someone could clarify something for me, because I wasn't able to find the information on the web; maybe it isn't known yet.
Do we replace all of our neurons during our lives?
We know neurogenesis exists and that we create new neurons even at old age, but are all of them replaced during life?
I also read somewhere that the atoms of our body are replaced every 7 years or so; is there scientific evidence to support that?

If one of those assertions is true, to me it means that we are already, all the time, slowly uploading ourselves into another biological substrate, and that it's a natural phenomenon.

Some people then say that we are not the same person we were 7 years ago, but in that case we can also say that we are not the same person we were yesterday, since our experiences during life change our memory, and therefore who we are, all the time.
Every time we learn something we are different thereafter.
We are patterns of information, constantly evolving, changing, "dying!?"

So I would like my biological neurons to be replaced by non-biological "immortal" ones at the same rate nature does it (at least as a first step, before maybe uploading to a substrate that works completely differently from a biological brain).


Edited by ben951, 06 October 2009 - 02:42 PM.


#7 Vgamer1

  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 06 October 2009 - 02:59 PM

I agree with forever freedom. Option 1 *might* give me the best odds of survival depending on the circumstances, although the way you've spelled it out doesn't sound like I'd be surviving.

With the options you've presented I'd probably pick option 3, since it seems to me that I wouldn't survive any of the scenarios you've presented. So I guess giving a copy of myself lots of money is the closest thing to my happiness, even though I wouldn't be aware of it.

The only way I would willingly upload myself is in the following way:

1) Link my brain up to a computer.

2) Have the computer copy my consciousness onto some kind of hardware/software keeping my biological body alive the whole time.

3) Somehow use the computer I am connected with to link my biological consciousness to the hardware/software consciousness so that I am experiencing both consciousnesses simultaneously via the computer interface.

4) Sever the link between the computer and my biological consciousness - but keep my biological body alive.

I'm not really sure what would happen, but I think I'd be conscious of only the hardware/software part at this point... It's an interesting thought experiment and would be an even more interesting actual experiment. What do you guys think would happen?

I would think this scenario would give me the most likely odds of survival, but I'm not 100% sure. We need more scientific knowledge of consciousness and computers before I'd be willing to do something like this.
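To pin down where my question actually sits, here are the four steps written out as explicit stages in Python (the stage names are just my own shorthand; nothing here answers the question, it only locates it):

    # The four steps as explicit stages, just to locate the open question.
    from enum import Enum, auto

    class Stage(Enum):
        LINKED = auto()    # 1) brain wired up to the computer
        COPIED = auto()    # 2) consciousness copied to hardware/software, bio body kept alive
        BRIDGED = auto()   # 3) experiencing both consciousnesses via the interface
        SEVERED = auto()   # 4) link cut; bio body still alive

    # After SEVERED there are two running instances:
    instances = ["biological", "hardware/software"]
    # Which of these (one, both, or neither) the pre-severance stream of awareness
    # continues into is exactly what no amount of bookkeeping can tell you.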

Edited by Vgamer1, 06 October 2009 - 03:01 PM.


#8 forever freedom

  • Guest
  • 2,362 posts
  • 67

Posted 06 October 2009 - 03:45 PM

What matters is deciding which of these 3 options has the highest chance of allowing my awareness to continue; i suppose that would be option 1.

Could you explain why option 1 would give you a higher chance of allowing your awareness to continue than option 2 or 3? To you it may be obvious, but to others it may not be.


I thought option 1 was supposed to be the one that gave the highest chance of continuation of awareness? I just assumed that the objective of this poll was to determine what monetary value people place on the risk to their lives/existence, considering an eternal existence. For me, no money compensates for risk (to an eternal existence), so I'd choose the option with the highest chance of survival, regardless of the money involved. After all, when living forever I'd be able to accumulate as much money as I'd ever want, so it would be much, much less important than it is now, when time to accumulate money is limited.

Edited by forever freedom, 06 October 2009 - 03:50 PM.


#9 exapted

  • Topic Starter
  • Guest
  • 168 posts
  • 0
  • Location:Minneapolis, MN

Posted 06 October 2009 - 10:37 PM

What matters is deciding which of these 3 options has the highest chance of allowing my awareness to continue; i suppose that would be option 1.

Could you explain why option 1 would give you a higher chance of allowing your awareness to continue than option 2 or 3? To you it may be obvious, but to others it may not be.


I thought option 1 was supposed to be the one that gave the highest chance of continuation of awareness? I just assumed that the objective of this poll was to determine what monetary value people place on the risk to their lives/existence, considering an eternal existence. For me, no money compensates for risk (to an eternal existence), so I'd choose the option with the highest chance of survival, regardless of the money involved. After all, when living forever I'd be able to accumulate as much money as I'd ever want, so it would be much, much less important than it is now, when time to accumulate money is limited.

Yes you're right. I guess I was just curious if you have any sort of mechanism or metaphysical claims in mind.

The dilemma sounds weird, but who knows, maybe some people will face medical dilemmas that are roughly analogous.

#10 exapted

  • Topic Starter
  • Guest
  • 168 posts
  • 0
  • Location:Minneapolis, MN

Posted 06 October 2009 - 10:57 PM

I agree with forever freedom. Option 1 *might* give me the best odds of survival depending on the circumstances, although the way you've spelled it out doesn't sound like I'd be surviving.

With the options you've presented I'd probably pick option 3, since it seems to me that I wouldn't survive any of the scenarios you've presented. So I guess giving a copy of myself lots of money is the closest thing to my happiness, even though I wouldn't be aware of it.

The only way I would willingly upload myself is in the following way:

1) Link my brain up to a computer.

2) Have the computer copy my consciousness onto some kind of hardware/software keeping my biological body alive the whole time.

3) Somehow use the computer I am connected with to link my biological consciousness to the hardware/software consciousness so that I am experiencing both consciousnesses simultaneously via the computer interface.

4) Sever the link between the computer and my biological consciousness - but keep my biological body alive.

I'm not really sure what would happen, but I think I'd be conscious of only the hardware/software part at this point... It's an interesting thought experiment and would be an even more interesting actual experiment. What do you guys think would happen?

I would think this scenario would give me the most likely odds of survival, but I'm not 100% sure. We need more scientific knowledge of consciousness and computers before I'd be willing to do something like this.

It sounds like you are sort of a reluctant dualist, because there is risk involved and we simply don't have scientific evidence to prove otherwise. But it also sounds like you believe science could resolve the issue.

What I mean is that it seems you believe your mind is possibly something different from simply the patterns in your physical brain. Or, maybe you believe that the mind could actually be the patterns, but that there could be some kind of super-pattern that we need to actually "pop" out of the brain and pop into the new mind through a special process.

I'm wondering if there are experiments that could be done, that could help to scientifically define the mind, and determine what happens to the mind in cases such as the ones in this dilemma. I would think when brain scanning tech gets much better, we could scan the brain for certain kinds of mathematical patterns, and possibly mathematically define terms such as "awareness". But I tend to think that the general claim of dualism is not falsifiable, by definition.

#11 Vgamer1

  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 12 October 2009 - 05:26 PM

It sounds like you are sort of a reluctant dualist, because there is risk involved and we simply don't have scientific evidence to prove otherwise. But it also sounds like you believe science could resolve the issue.

What I mean is that it seems you believe your mind is possibly something different from simply the patterns in your physical brain. Or, maybe you believe that the mind could actually be the patterns, but that there could be some kind of super-pattern that we need to actually "pop" out of the brain and pop into the new mind through a special process.

I'm wondering if there are experiments that could be done, that could help to scientifically define the mind, and determine what happens to the mind in cases such as the ones in this dilemma. I would think when brain scanning tech gets much better, we could scan the brain for certain kinds of mathematical patterns, and possibly mathematically define terms such as "awareness". But I tend to think that the general claim of dualism is not falsifiable, by definition.


I'm not a dualist. I believe the mind is simply brain patterns.

Consider the experiment I laid out in my last post. What do you think would happen in the scenario?

1) Link my brain up to a computer.

2) Have the computer copy my consciousness onto some kind of hardware/software keeping my biological body alive the whole time.

3) Somehow use the computer I am connected with to link my biological consciousness to the hardware/software consciousness so that I am experiencing both consciousnesses simultaneously via the computer interface.

4) Sever the link between the computer and my biological consciousness - but keep my biological body alive.


Which consciousness would I be experiencing after severing the link? Maybe this experiment would answer some questions about consciousness like you're wondering.

EDIT: As for your original question, I still need some clarification. What exactly do you mean by "gradual"? Do you mean that my neurons would be replaced by silicon chips piece by piece, or simply that the upload process would be done over time instead of instantly?

If it would be done as more of an upload, then I stick with my answer of needing the linking process before doing it, although I don't really know what would happen.

Edited by Vgamer1, 12 October 2009 - 05:32 PM.


#12 eternaltraveler

  • Guest, Guardian
  • 6,471 posts
  • 155
  • Location:Silicon Valley, CA

Posted 12 October 2009 - 11:46 PM

Which consciousness would I be experiencing after severing the link


The problem with your thought experiment is that you don't consider that both could be equally a continuation of the present you, which then begin to diverge from each other immediately after severing the link.

#13 Vgamer1

  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 13 October 2009 - 12:04 AM

The problem with your thought experiment is that you don't consider that both could be equally a continuation of the present you, which then begin to diverge from each other immediately after severing the link.


That's exactly what I'm talking about! I don't know which one I would be after the link is severed. That's the question I'm posing.

#14 lunarsolarpower

  • Guest
  • 1,323 posts
  • 53
  • Location:BC, Canada

Posted 13 October 2009 - 12:25 AM

The problem with your thought experiment is that you don't consider that both could be equally a continuation of the present you, which then begin to diverge from each other immediately after severing the link.


That's exactly what I'm talking about! I don't know which one I would be after the link is severed. That's the question I'm posing.


I think you would be both. Further, I think you should be able to rejoin the consciousnesses down the line, making the determination easy even for those of us with a current perspective of who is actually you. However there is no reason you couldn't merge consciousnesses with others as well which will create all kinds of conundrums for those using the cleanly delineated biological definitions of identity and personhood.

#15 Vgamer1

  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 13 October 2009 - 12:41 AM

I think you would be both. Further, I think you should be able to rejoin the consciousnesses down the line, making the determination easy even for those of us with a current perspective of who is actually you. However there is no reason you couldn't merge consciousnesses with others as well which will create all kinds of conundrums for those using the cleanly delineated biological definitions of identity and personhood.


What does it mean to be both? Being aware of both consciousnesses at once? Or something else?

If you would be both, then what would "rejoining" them accomplish?

#16 eternaltraveler

  • Guest, Guardian
  • 6,471 posts
  • 155
  • Location:Silicon Valley, CA

Posted 13 October 2009 - 01:04 AM

The problem with your thought experiment is that you don't consider that both could be equally a continuation of the present you, which then begin to diverge from each other immediately after severing the link.


That's exactly what I'm talking about! I don't know which one I would be after the link is severed. That's the question I'm posing.


No. That is not what you are talking about. If it were, you wouldn't wonder which "one" you'd be. When a cell divides, which "one" is the original cell? Any answer would be a non sequitur.

Edited by eternaltraveler, 13 October 2009 - 01:13 AM.


#17 Vgamer1

  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 13 October 2009 - 03:49 AM

No. That is not what you are talking about. If it were, you wouldn't wonder which "one" you'd be. When a cell divides, which "one" is the original cell? Any answer would be a non sequitur.


So what's your answer?

#18 DJS

  • Guest
  • 5,798 posts
  • 11
  • Location:Taipei
  • NO

Posted 13 October 2009 - 04:46 AM

The problem with your thought experiment is that you don't consider that both could be equally a continuation of the present you, which then begin to diverge from each other immediately after severing the link.


That's exactly what I'm talking about! I don't know which one I would be after the link is severed. That's the question I'm posing.


I think you would be both. Further, I think you should be able to rejoin the consciousnesses down the line, making the determination easy even for those of us with a current perspective of who is actually you. However there is no reason you couldn't merge consciousnesses with others as well which will create all kinds of conundrums for those using the cleanly delineated biological definitions of identity and personhood.


Yes, this is where my 'patternist' intuitions lead me as well.

What matters is deciding which of these 3 options has the highest chance of allowing my awareness to continue; i suppose that would be option 1.

Could you explain why option 1 would give you a higher chance of allowing your awareness to continue than option 2 or 3? To you it may be obvious, but to others it may not be.


I thought option 1 was supposed to be the one that gave the highest chance of continuation of awareness? I just assumed that the objective of this poll was to determine what monetary value people place on the risk to their lives/existence, considering an eternal existence. For me, no money compensates for risk (to an eternal existence), so I'd choose the option with the highest chance of survival, regardless of the money involved. After all, when living forever I'd be able to accumulate as much money as I'd ever want, so it would be much, much less important than it is now, when time to accumulate money is limited.


While there is a degree of existential risk in scenarios 2 and 3 (since there might be some essential component of identity which resides on a yet to be discovered layer of reality... basically this assessment item is an acknowledgement of our incomplete knowledge regarding reality), there is also a degree of existential risk in not being properly positioned to take advantage of technological trends as they come along.

If, as is the case in this hypothetical, we're at the point of atomically precise duplication and substrate transfers, then I would probably find myself becoming much more Kurzweilian in my perspective (in terms of rates of progress - not dangerously delusional utopian technophilia). If I came to the conclusion that "the singularity is near", I would also believe that I was entering a volatile and dangerous time in history. It would be difficult, if not impossible, to determine the consequences of someone attaining post human status before me. This would cause my existential risk tolerance on other matters to increase. My overriding priority would be attaining post human status as quickly as possible. Hence, a billion dollars and a non-zero risk substrate transfer would be an obvious choice for me.

Of course, all of this is pie in the sky, but it fits in well with outlandish hypotheticals.
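Just to make that trade-off explicit, here's a back-of-the-envelope expected-value sketch in Python. The numbers, and the pretense that "value of survival" and "value of positioning" are scalars, are entirely made up; the only point is how shrinking the perceived risk gap, or inflating what the billion buys, flips the choice:

    # Back-of-the-envelope only. "v_survive" is whatever you think continued existence
    # is worth; "v_positioning" is what a billion dollars buys you in a pre-singularity
    # scramble; "p_survive" is your subjective probability that the procedure preserves you.
    def ev(p_survive, v_positioning, v_survive=1000.0):
        # positioning only pays off for "you" if you count the result as surviving you
        return p_survive * (v_survive + v_positioning)

    print(ev(0.99, 0) > ev(0.90, 10))    # True: with a big perceived risk gap, option 1 wins
    print(ev(0.99, 0) > ev(0.98, 100))   # False: shrink the gap and raise what the money
                                         # buys, and option 3 (copy + rifle + $1B) wins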

#19 exapted

  • Topic Starter
  • Guest
  • 168 posts
  • 0
  • Location:Minneapolis, MN

Posted 13 October 2009 - 07:45 AM

Which consciousness would I be experiencing after severing the link


The problem with your thought experiment is that you don't consider that both could be equally a continuation of the present you, which then begin to diverge from each other immediately after severing the link.

That is an interesting way of thinking about it. I think they could both be full continuations of the present you, depending on your definition of "continuation" or "you". However, once the two start to diverge, we could consider one to be closer in identity to the original than is the other. I think we need a new definition of "you", a criterion for identity.

Ideas that the original has, whose content depends on which of the two is considering them, have the potential to make the two diverge in ways that relate to the drives of the original mind - divergence caused by the mind replication procedure itself. For example, if you strongly believe that replica minds lack something special and undefined, or that replica minds are insults to the life of the original, that belief could directly transform the drives of the replica mind. So from the perspective of the original, the replication procedure could sort of rob you of the fruition of some of your drives; or at least the beliefs you have about the procedure have the potential (in concert with the procedure) to rob you of your identity.

#20 Vgamer1

  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 13 October 2009 - 02:47 PM

That is an interesting way of thinking about it. I think they could both be full continuations of the present you, depending on your definition of "continuation" or "you". However, once the two start to diverge, we could consider one to be closer in identity to the original than is the other. I think we need a new definition of "you", a criterion for identity.


eternaltraveler and I have a difference of opinion here. It's a bit of a confusion of terms that is difficult to sort out.

At the point of the severance, both would be a "continuation" of "you," and I agree that they would start to diverge. The question I'm considering is a different one though. I'm trying to think about which person I would be conscious of - the original, the copy, both, or neither. I'm considering all of the possibilities.

Now, I can't really be sure which one I would be conscious of.

What I don't think I've brought up in this thread is the option of gradually replacing parts of an organic brain with inorganic parts as a method of uploading. I believe this is actually the safest option. If I could take the gradual replacement option with no money, I would take it. However, exapted, your first poll option isn't quite clear on what "gradual transfer" means. Is it a replacement process, or is it more traditional uploading just done "slowly"?

To me, it doesn't really make a difference if a traditional upload is done slowly, quickly, or instantly - in my opinion they would all result in a copy of me existing on a computer and me still sitting in my biological brain. I would still not want the biological me to be killed and neither would my copy.

Edited by Vgamer1, 13 October 2009 - 02:47 PM.


#21 eternaltraveler

  • Guest, Guardian
  • 6,471 posts
  • 155
  • Location:Silicon Valley, CA

Posted 13 October 2009 - 06:10 PM

No. That is not what you are talking about. If it were, you wouldn't wonder which "one" you'd be. When a cell divides, which "one" is the original cell? Any answer would be a non sequitur.


So what's your answer?


fish

non sequitur

Edited by eternaltraveler, 13 October 2009 - 06:11 PM.


#22 DJS

  • Guest
  • 5,798 posts
  • 11
  • Location:Taipei
  • NO

Posted 13 October 2009 - 08:26 PM

When I witness all of this hand-wringing over some intangible, completely undefined aspect of identity, the phrase that comes to mind is 'residual soul psychology'. It takes great strength of will to, first, recognize that (objectively) you are not a unique and special snowflake and, second, get on with living and striving to live indefinitely despite this apparent fact of reality.

Perhaps someone could explain precisely what they mean by continuity of consciousness, and also how this concept differs from the traditional conception of 'soul'.

#23 DJS

  • Guest
  • 5,798 posts
  • 11
  • Location:Taipei
  • NO

Posted 13 October 2009 - 09:39 PM

One of the thoughts I frequently have on this topic is this: assuming that intuitions will continue to differ - sometimes strongly - on questions of identity, would staunch patternists in a posthuman future have a competitive advantage over nonpatternists who are unwilling to be duplicated, rejoined, or to possess a disjointed collective consciousness?

Edited by DJS, 13 October 2009 - 09:40 PM.


#24 Vgamer1

  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 13 October 2009 - 11:32 PM

Perhaps someone could explain precisely what they mean by continuity of consciousness, and also how this concept differs from the traditional conception of 'soul'.


Hmmm... Hard to explain "precisely" and therein lies the problem. Sleeping, for example. I'm not sure if I have continuity of consciousness from the time before I sleep to the time I wake up.

To me, the patternist view is just as much soul-derived as my view - or is equally not derived. To put it in "soul" terms, the patternist believes that when a copy is created, the "soul" is re-inserted or transferred into the copy. This makes little to no sense to me, but again, I believe it is more of a confusion of terms than an actual debate.

I usually put it into these simple terms: If an exact copy of me is created, I will not be aware of the copy's consciousness. I will only be aware of the original's consciousness - my consciousness. If we then kill the original, then my awareness will cease and the copy will live on, but without my awareness. That's about as "precisely" as I can put it.

Forgive me if I've misrepresented the patternist view, but I believe the patternist view is that it doesn't matter that a copy lives on while the original dies because it will be - for all intents and purposes - the same consciousness. This is where the disagreement lies. I believe it is in essence the same consciousness, but what matters to me is which consciousness I am aware of.

Edited by Vgamer1, 13 October 2009 - 11:35 PM.


#25 eternaltraveler

  • Guest, Guardian
  • 6,471 posts
  • 155
  • Location:Silicon Valley, CA

Posted 14 October 2009 - 12:40 AM

To put it in "soul" terms, the patternist believes that when a copy is created, the "soul" is re-inserted or transferred into the copy


nonsense. There is nothing like a soul to transfer to begin with. The patternist view holds that you are your pattern. Period. If you make a copy of that pattern and destroy the original at the same moment you still live.

You better hope the patternist view is correct otherwise we all die over the course of every year as almost all the atoms in our body outside of our bones are replaced in that period (all the atoms in your bones are replaced about every 7 years). We are copies of our former selves.

#26 eternaltraveler

  • Guest, Guardian
  • 6,471 posts
  • 155
  • Location:Silicon Valley, CA

Posted 14 October 2009 - 12:43 AM

you are not a unique and special snowflake


aren't snowflakes just pretty patterns? I'm a unique and special pattern until such time as this snowflake starts making trillions of copies. Can't speak for you :-D

#27 DJS

  • Guest
  • 5,798 posts
  • 11
  • Location:Taipei
  • NO

Posted 14 October 2009 - 04:23 AM

you are not a unique and special snowflake


aren't snowflakes just pretty patterns? I'm a unique and special pattern until such time as this snowflake starts making trillions of copies. Can't speak for you :-D


Notice right before that snippet I wrote 'objectively'. Depending on your metaphysical outlook, the term 'theoretical' could also have been inserted. There is nothing inherently unique in any pattern of information if it can in principle be duplicated. Now, regarding 'special', you may have a point, but please try to indulge me at least a little when I'm quoting Fight Club.

#28 Vgamer1

  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 14 October 2009 - 05:05 AM

nonsense. There is nothing like a soul to transfer to begin with. The patternist view holds that you are your pattern. Period. If you make a copy of that pattern and destroy the original at the same moment you still live.


Yes, I agree that souls don't exist - I'm not a dualist or religious. I was just making the point to DJS that it's kinda silly to say that all nonpatternists believe in souls by default.

You better hope the patternist view is correct otherwise we all die over the course of every year as almost all the atoms in our body outside of our bones are replaced in that period (all the atoms in your bones are replaced about every 7 years). We are copies of our former selves.


This is another example of the dilemma I'm talking about. I'm not sure that I'm still me from moment to moment or from year to year. I could be dying at each quantum instant, and I see your point, but we are by no means "copies" of our former selves. Sorry, but I must pick this apart a bit. Like you say, the atoms in our bodies are constantly being replaced by new ones. Me one year from now is definitely not a copy of me now.

The issue I have is with "instant" copying. Or rather complete destruction of the original and recreation of a new entity that is the same.

I'm still waiting for a response from patternists for my question about which entity you are aware of. If I replicate you, you will not be aware of the replica - you will only be aware of you the original. Then if I kill the original you, you will be dead along with your consciousness while the replica continues as a new consciousness. Which one are you aware of? The only answer in my eyes is the original.

The condition of killing the original "instantly" or "as soon as the copy is made" seems very arbitrary to me. And how is this going to be accomplished? Simultaneity is very hard to pull off if not impossible, especially when accounting for relativity which factors in even on small scales if you really do want "instant."

I'd really like an answer for this, but usually the issue I'm talking about seems to get sidestepped by patternists. I don't know why. Eternaltraveler likes to just say "non sequitur" instead of giving an actual response.

It's really not my problem though. If patternists really want to kill themselves many times over because "they won't be aware of it," that's fine. You can laugh at that and say I'm disadvantaging myself in the posthuman world, but that's my current stance, sorry. If some future experiment can solve the debate, which may be possible, I will remain open-minded. Potentially my experiment could answer the question, but eternaltraveler seems to think it's a "non sequitur" even though it is by all means very possible.

Just as a side question, I'm assuming you (eternaltraveler and DJS) would use a Star Trek-style teleporter, yes?

#29 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 14 October 2009 - 05:13 AM

O_o

Once you believe (and I don't seem to) that the instant cut-and-paste version is really you, then option 3 is the only one that makes sense, because you'd regenerate anyway...

But since I don't believe it's really you, option one is the only way to go.

#30 DJS

  • Guest
  • 5,798 posts
  • 11
  • Location:Taipei
  • NO

Posted 14 October 2009 - 05:48 AM

Perhaps someone could explain precisely what they mean by continuity of consciousness, and also how this concept differs from the traditional conception of 'soul'.

Forgive me if I've misrepresented the patternist view, but I believe the patternist view is that it doesn't matter that a copy lives on while the original dies because it will be - for all intents and purposes - the same consciousness. This is where the disagreement lies. I believe it is in essence the same consciousness, but what matters to me is which consciousness I am aware of.


But 'aware of' is exactly the same as 'conscious of'. Therefore, what you're actually saying is that you define your identity by the consciousness which you are conscious of. :-D

Still, I understand what you're trying to convey because it is the other position in identity and duplication debates. What you're arguing is that point of view (POV) is what matters. This position has strong intuitive appeal because any one of us can imagine a hypothetical scenario where we confront our duplicate and see that they are another individual.

The alternative to the POV intuition is a second, newer intuition which results from us utilizing our intellect and the expanding knowledge we're acquiring about our own nature. A book which I like recommending for an insightful and original look at the philosophical implications of functionalism is Being No One by Metzinger. If you don't feel like dropping the money for the book there are also a few Metzinger video presentations online.

Basically what I'm trying to claim is that, once you have a solid theoretical understanding of functionalism, you will understand that your point of view intuition is inferior, on intellectual grounds, to the patternist intuition. Whether you believe that overriding an intuition on intellectual grounds is a valid course of action is another issue entirely, and one which depends in part upon one's preexisting values. Although it should be noted that, if cognitive science has taught us anything, it's that our intuitions can often lead us far astray. But here I am appealing to the value of the intellect on intellectual grounds...

To restate my position in clearer terms: identity is the sum total of such things as personality, intellect, memories, etc. - all of which are physically encoded in our brains. It is a static concept. Consciousness is not a component of identity, but it is vitally important for the existence of identity. (I'm having a difficult time coming up with an analogy to convey this idea.) The process of consciousness allows our identity to interact with itself and its environment, thereby evolving with time.

On a side note, I view identity combined with consciousness as being a dynamic concept which I refer to as 'Being' or 'Becoming'.

Edited by DJS, 14 October 2009 - 07:14 AM.




