  LongeCity
              Advocacy & Research for Unlimited Lifespans





Which of these philosophies do you believe?


92 replies to this topic

Poll: Philosophy A or B? (47 members have cast votes)

Which Philosophy?

  1. Philosophy A: 30 votes (63.83%)

  2. Philosophy B: 17 votes (36.17%)


#1 Vgamer1

  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 25 February 2009 - 08:54 AM


It is a bit difficult to describe the two philosophies I'm trying to differentiate here. I think the best way to describe them is with an example...

So let's say your consciousness is being recorded constantly onto a "backup" of sorts, so that at every instant the "backup" matches your exact consciousness. Now let's say you are killed somehow so that your original consciousness is completely destroyed. Now, your "backup" was not destroyed, and at the exact moment of your death your "backup" is booted up and starts running. Here's the conflict...

Philosophy A:

You are dead. You no longer hear, taste, smell, see, or feel anything. The "backup" of you is just a copy and nothing more. You will still be dead despite there being an exact copy of your consciousness in existence.

Philosophy B:

You are still alive. You still hear, taste, smell, see, and feel everything that you normally would. You exist as your "backup" just as you did before. You will be alive as long as you still have a "backup" to boot up when you die.


I personally strongly believe in Philosophy A. It seems that people have strong opinions no matter which side they are on. I'm curious to see how many are on each side. I'll say more about what I think as the thread grows.

So which do you believe? Philosophy A or philosophy B?

#2 JLL

  • Guest
  • 2,192 posts
  • 161

Posted 25 February 2009 - 09:21 AM

Good question.

If philosophy B were correct, what would happen if my backup were booted up in another vessel while my original self is still alive? Would there be two of me?

Perhaps we should eventually run an experiment to see which one is true, though I'm not sure an experiment could prove or disprove these theories. At this point, I would say a backup is a lot better than nothing, but I would still prefer to protect my original self from dying.

#3 valkyrie_ice

  • Guest
  • 837 posts
  • 142
  • Location:Monteagle, TN

Posted 25 February 2009 - 10:40 AM

This grew out of a conversation Vgamer and I had in the Ustream chat. Sadly, it didn't allow us to go back and simply post the chat, as many of the possibilities were discussed there.

I am strongly B. In my view, who I am is the sum total of my thought processes and memories. As such, I view preservation of data as the essential factor. So long as the backup was 99.99999999999999999999999999999999999999999999999999999999999999999999999% accurate, missing no more than, say, the actual instant of my demise, I would view the backup as me. The missing milliseconds would be no more than the same kind of unconsciousness as that induced by anesthesia.

In the case of a backup being booted up while the "original" is still in existence, I feel that both of "us" would be "me". At the point where we diverged, we would both have separate experiences, and we would both consider ourselves "originals". Additionally, we would likely exchange our memories regularly so that we both would be able to share our mutual experiences. In truth, I would probably deliberately create a few copies of myself, in both genders and in several different forms, and regularly reintegrate our memories so that we could all experience the totality of our existence, and eventually decide which of us is my preferred existence, if we don't decide to remain a multiplicity.

In the case of the loss of the "original," I would view the backup as myself, even if the backup was missing a few hours of my existence. For example, if my "original" was blown to bits by an atomic bomb, I certainly would not choose to recall the experience.

Additionally, in the case of a transporter of the Star Trek variety, I view both the me who is disassembled and the me who is reassembled as *me*. In the case of a copy remaining, as above, we would simply exist separately until such time as we could reintegrate our data.

As I see the universe, we exist as data. According to hyperstring theory, all of our reality is simply the expression of complex patterns of energy into our 4-dimensional space. Since my consciousness is the emergent phenomenon that exists as a result of the complexity of this pattern, so long as the pattern of my consciousness exists, *I* exist.

Another example: would I view a copy of me that was overwritten onto someone other than myself as *me*? I have to say that yes, I would, because if *I* overwrote the other person entirely, he would have ceased to exist. And if the "original me" were to die, the other me would survive, and still be *ME*. (Actually, it occurs to me that H.P. Lovecraft anticipated this in The Thing on the Doorstep.)

In a sense, I view this as an issue of whether the hardware is essential to the existence of *me*, or whether *I* exist merely as the data which makes up my consciousness. So I suggested Hardware Centrism versus Data Centrism as a possible name, as I do not know if there are pre-existing names for these schools of thought.


#4 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 25 February 2009 - 03:17 PM

I believe that A is correct.
Because you can just back up and then copy-paste into a clone, it isn't you.

I also don't believe that consciousness truly exists.

Edited by Winterbreeze, 25 February 2009 - 03:18 PM.


#5 forever freedom

  • Guest
  • 2,357 posts
  • 68

Posted 25 February 2009 - 10:30 PM

I think option B is correct.

I believe that A is correct.
Because you can just back up and then copy-paste into a clone, it isn't you.


Why not? I think that if this happened, there would be 2 of you. Now the two of you would start having different experiences and you two would develop into different beings, the same way we are constantly developing into different "us" as time passes and the environment we are in changes us both physically and mentally.


I also don't believe that consciousness truly exists.


What do you mean? How can consciousness not exist?

#6 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 26 February 2009 - 04:58 AM

I think option B is correct.

I believe that A is correct.
Because you can just back up and then copy-paste into a clone, it isn't you.


Why not? I think that if this happened, there would be 2 of you. Now the two of you would start having different experiences and you two would develop into different beings, the same way we are constantly developing into different "us" as time passes and the environment we are in changes us both physically and mentally.


I also don't believe that consciousness truly exists.


What do you mean? How can consciousness not exist?


Well, suppose you're just copied.
You are you; you can see the other you in front of you, but you can't feel, see, or hear anything from his senses; therefore you are not him.
Now, for whatever reason, you die. The other you still exists. He is a replica, but he is not you, or should I say, you are not him: you have stopped sensing, hearing, feeling. You are dead.
Therefore no, a replica is not *you*. It is a copy of you, but it won't save you from dying; the idea is just silly.
It is almost as silly as "living on through your offspring".

As for consciousness, we are just biological robots; there is nothing magical or special like consciousness. We feel and sense through our senses.
Our thoughts are just chemical processes.
I am not sure how much choice we *really* have in life; we may seem to have a choice because we believe that we are thinking and choosing.
But isn't that just a biological thing? It might be a bit looser, so we do have some choice, as an organism.
But consciousness is nowhere in this formula. There is no spirit in the body; there is just a bio-robot, simple as that.

#7 valkyrie_ice

  • Guest
  • 837 posts
  • 142
  • Location:Monteagle, TN

Posted 26 February 2009 - 06:24 AM

I think option B is correct.

I believe that A is correct.
Because you can just back up and then copy-paste into a clone, it isn't you.


Why not? I think that if this happened, there would be 2 of you. Now the two of you would start having different experiences and you two would develop into different beings, the same way we are constantly developing into different "us" as time passes and the environment we are in changes us both physically and mentally.


I also don't believe that consciousness truly exists.


What do you mean? How can consciousness not exist?


Well, suppose you're just copied.
You are you; you can see the other you in front of you, but you can't feel, see, or hear anything from his senses; therefore you are not him.
Now, for whatever reason, you die. The other you still exists. He is a replica, but he is not you, or should I say, you are not him: you have stopped sensing, hearing, feeling. You are dead.
Therefore no, a replica is not *you*. It is a copy of you, but it won't save you from dying; the idea is just silly.
It is almost as silly as "living on through your offspring".

As for consciousness, we are just biological robots; there is nothing magical or special like consciousness. We feel and sense through our senses.
Our thoughts are just chemical processes.
I am not sure how much choice we *really* have in life; we may seem to have a choice because we believe that we are thinking and choosing.
But isn't that just a biological thing? It might be a bit looser, so we do have some choice, as an organism.
But consciousness is nowhere in this formula. There is no spirit in the body; there is just a bio-robot, simple as that.



In order to clarify my position and explain why I feel that a copy of me is still me, perhaps it will help if I explain how I understand the universe to work.

To begin, while superstring theory is not yet proven, I do feel it explains our universe in a pretty comprehensive and, to me, fairly straightforward manner. Superstring theory in its simplest terms states that only energy is "real". Regardless of the kind of energy, at its base, energy is energy is energy. Every kind of energy is a vibration that occurs in the superstring.

Whether there is one superstring, many superstrings, or no superstrings is somewhat irrelevant. The vibrational energy "behaves" as if it is vibrating on a looped string, so it makes a convenient visualization to look at the basic structure of the universe as a matrix of strings packed in a regular grid array, all of them vibrating to a greater or lesser degree.

Now, since these strings are multidimensional, and vibrate in many more than 3 dimensions, the energy vibrating in them is also multidimensional. All elementary particles are simply vibrations in the string. Stable particles such as electrons, neutrons, and protons are vibrations which are static; in other words, they are vibrations at a frequency which is some multiple of the diameter of the string, so that the waveform is a self-reinforcing sine wave. All other particles have a non-stable wave frequency, so that their waveform cancels itself out in short order. Also, since the strings are extremely close together, the vibrations of one string will cause the strings around it to vibrate as well, so any particle can be looked at as a group of strings, all vibrating at the same frequency. Because the energy travels freely from string to string, it is impossible to state exactly which string is THE string which initiated the vibration, and so we have indeterminacy.

Thus, we can look at any particle not only as a particle, which it is because it exists on a given string, but as a wave form, because the energy is resonating between many strings at once.

Therefore, the simplest way to look at a particle is as a spherical field of probability, basically a sphere-like bullseye. In the center you have the zone of 100% likelihood, the group of strings in which the energy exists, surrounded by zones of decreasing likelihood, following an inverse logarithmic formula. Thus the field extends to infinity, but is strongest in the immediate vicinity of the particle. As every particle has this field of probability, every particle has some effect on every other particle, though this effect is negligible over any significant distance.

I could at this point go on into applying this to the structure of the universe, but the point I wish to illustrate is that when viewed from this perspective, it becomes pretty obvious that the energy in superstrings is arrayed in PATTERNS. These simple patterns are the basic building blocks of everything in the universe, and it is these patterns that can basically be viewed in much the same way as binary code, in other words as the programming language of the universe.

Everything that exists is composed of these patterns, connected together in varying degrees of complexity. Just as 0s and 1s are used to build enormously complex computer programs, our universe uses the patterns of Positive, Negative, and Neutral as a trinary code to build the infinite complexity we see all around us. Everything we see, feel, experience, and think is a result of arrangements of this trinary code. As such, it becomes obvious that the Universe exists as an unimaginably vast quantum computer, in which every state is calculated from the result of every previous state.

To me, as a computer program existing inside this marvelous machine, it is obvious that my consciousness is the result of a massively complex program of trinary data which holds an enormous amount of vibrational energy in a very precise pattern. It is this pattern which produces the emergent phenomenon of my self-awareness. I am a program who is aware of itself as a program inside the machine.

And like any program, I am composed of an enormous number of smaller subroutines, most of which are common to any other program in the machine. Only two things make me unique: the basic subroutine which produces the hardware on which I run, my DNA, and the actual program which processes data, my consciousness. Like a Java program, my DNA creates a virtual platform which interfaces with the true hardware of the universe, on which the real me runs. I currently need this virtual platform, because without it my program cannot run, but this platform is imperfect; it is filled with numerous bugs and less-than-optimal subroutines, all of which tend to accumulate errors as the universe continues to calculate its never-ending quantum state. As such, while it is adequate to run my consciousness, it can only do so for a limited time before accumulated errors cause a system crash.

Unfortunately, since my data is stored and processed solely within this virtual machine, when the machine crashes, my data is erased. While it is possible that some elements of my program may survive as random code in the universal database and be incorporated in another program somewhere, the unique accumulation of patterned data which is my consciousness will not. When the hardware goes, I lose everything.

However, since I AM a program, if my data is saved, and transferred to a new virtual machine, my program will continue to function uninterrupted, just the same as if I transferred the contents of my hard drive from an old computer to a new one.
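The hard-drive analogy above can be sketched in code. This is a hypothetical illustration only; the `Machine`, `snapshot`, and `step` names are invented for this example and are not a real API:

```python
# Hypothetical sketch of "transfer the data, keep the program running".
import copy

class Machine:
    """Stand-in for a substrate that runs a stateful 'consciousness'."""
    def __init__(self, state=None):
        self.state = state if state is not None else {"memories": [], "t": 0}

    def step(self, event):
        """Accumulate one experience."""
        self.state["memories"].append(event)
        self.state["t"] += 1

    def snapshot(self):
        """Serialize the full state, independent of this hardware."""
        return copy.deepcopy(self.state)

old = Machine()
old.step("saw a sunrise")
old.step("read a book")

new = Machine(old.snapshot())      # transfer the data to fresh hardware
new.step("woke up on new hardware")

# The transferred program picks up exactly where the old one left off:
print(new.state["memories"][:2] == old.state["memories"])  # True
```

On this view the `deepcopy` is doing all the work: nothing about the original hardware is carried over, only the pattern of data.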

Therefore, the truly UNIQUE part of ME is the Consciousness program. And regardless of what platform it runs on, so long as it continues intact, I WILL EXIST.

#8 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 26 February 2009 - 07:01 AM

valkyrie_ice, what if your system doesn't crash but the data is copied to another piece of hardware?
Not cut-paste, but copy-paste.

Can you feel from the other copy's hardware and sense through it?
I say no. So what is the difference that makes it work if you cut instead of copy? Aren't you just trapped in a misconception?

#9 Vgamer1

  • Topic Starter
  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 26 February 2009 - 07:02 AM

Well, suppose you're just copied.
You are you; you can see the other you in front of you, but you can't feel, see, or hear anything from his senses; therefore you are not him.
Now, for whatever reason, you die. The other you still exists. He is a replica, but he is not you, or should I say, you are not him: you have stopped sensing, hearing, feeling. You are dead.
Therefore no, a replica is not *you*. It is a copy of you, but it won't save you from dying; the idea is just silly.


This is my view exactly. I don't really see how people can see it any other way.


#10 valkyrie_ice

  • Guest
  • 837 posts
  • 142
  • Location:Monteagle, TN

Posted 26 February 2009 - 07:30 AM

And I have already answered that question in my first post in this thread. A copy of me existing simultaneously will be me at the moment of creation. We will exist separately, and continue to experience and process data separately. And we will both, as ME, seek to share our data with our other self so that OUR experiences can be reintegrated at regular intervals.

What I see in your point of view is elevation of the uniqueness of the HARDWARE over the program running on it. In essence you are stating that the hardware supersedes the program running on it, and that therefore ONLY the hardware matters. To which I must ask: how do you justify seeking to modify the hardware, since any modifications will destroy the uniqueness which you are claiming for it?

In essence, you are using a variation of the old religious argument for the inherent specialness of humanity as the "creation" of God, and essentially rehashing the same tired old issues of whether we have "souls."

So let me ask you this: if you created a copy which was capable of instantaneously synchronizing data, so that essentially you were experiencing BOTH existences simultaneously, and you did so for such a long period of time that you could no longer remember which of your two IDENTICAL bodies was the original, and then decided to merge again, essentially "killing" one of your two bodies, how would you view it?

And understand, both bodies are YOU, because you are seeing, feeling, tasting, and experiencing both simultaneously as a single consciousness. Any thoughts and actions are conducted in parallel, and at no point is your conscious mind not receiving input from both bodies.

#11 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 26 February 2009 - 09:52 AM

And I have already answered that question in my first post in this thread. A copy of me existing simultaneously will be me at the moment of creation. We will exist separately, and continue to experience and process data separately. And we will both, as ME, seek to share our data with our other self so that OUR experiences can be reintegrated at regular intervals.

What I see in your point of view is elevation of the uniqueness of the HARDWARE over the program running on it. In essence you are stating that the hardware supersedes the program running on it, and that therefore ONLY the hardware matters. To which I must ask: how do you justify seeking to modify the hardware, since any modifications will destroy the uniqueness which you are claiming for it?

In essence, you are using a variation of the old religious argument for the inherent specialness of humanity as the "creation" of God, and essentially rehashing the same tired old issues of whether we have "souls."

So let me ask you this: if you created a copy which was capable of instantaneously synchronizing data, so that essentially you were experiencing BOTH existences simultaneously, and you did so for such a long period of time that you could no longer remember which of your two IDENTICAL bodies was the original, and then decided to merge again, essentially "killing" one of your two bodies, how would you view it?

And understand, both bodies are YOU, because you are seeing, feeling, tasting, and experiencing both simultaneously as a single consciousness. Any thoughts and actions are conducted in parallel, and at no point is your conscious mind not receiving input from both bodies.


Uhh, I wasn't claiming there is a soul; I was arguing against it.
I was never claiming that the hardware is unique either. I was claiming that copy-pasted data, however accurate it is, is not the same thing. Think continuum.
I am not claiming that I am right; this is just my opinion.

I might have missed it, but I didn't see you arguing for syncing between the two bodies before. How will they act? Will each of them be able to think without the other?
You might just have two beings sharing experiences over a wireless network; that's still not you x2, but rather you and another you.

I am one of the people least likely to suggest that humans are special. Don't claim things just because people don't agree with or find logic in your theories. You might be right, but people don't have to agree with you, especially in a world where we are so far from being able to say that we know anything.

I said my opinions; I am not going to return to this discussion ;)

#12 valkyrie_ice

  • Guest
  • 837 posts
  • 142
  • Location:Monteagle, TN

Posted 26 February 2009 - 10:59 AM

Uhh, I wasn't claiming there is a soul; I was arguing against it.
I was never claiming that the hardware is unique either. I was claiming that copy-pasted data, however accurate it is, is not the same thing. Think continuum.
I am not claiming that I am right; this is just my opinion.


And I did not claim you were speaking of a soul; I said I saw your argument as akin to the religious argument for the unique position of humanity. Nor was I addressing only you, as Vgamer and I had been having this discussion in chat previously. I was stating how I saw this viewpoint by restating it as I understood it. At no point did I state that you were incorrect, nor criticize your statements.


I might have missed it, but I didn't see you arguing for syncing between the two bodies before. How will they act? Will each of them be able to think without the other?
You might just have two beings sharing experiences over a wireless network; that's still not you x2, but rather you and another you.


Yes, you must indeed have missed it, as I did state that my copies would reintegrate memories. I was restating that reintegration process differently to attempt to clarify my views, making the reintegration immediate in answer to your statements about the lack of simultaneous experience, to find out whether it was the lack of simultaneity that you saw as a breakdown in continuity, or whether there were other issues. I attempted to state that as clearly as possible in an attempt to gain further understanding of your views. That is the purpose of debate, after all.

I am one of the people least likely to suggest that humans are special. Don't claim things just because people don't agree with or find logic in your theories. You might be right, but people don't have to agree with you, especially in a world where we are so far from being able to say that we know anything.

I said my opinions; I am not going to return to this discussion ;)


That is of course your prerogative. My apologies that you took my attempts to gather further information as an attack. It was not my intention.


Edited by valkyrie_ice, 26 February 2009 - 11:00 AM.


#13 100YearsToGo

  • Guest
  • 204 posts
  • 1
  • Location:Netherlands Antilles

Posted 26 February 2009 - 09:13 PM

I voted A.

The copy is not you. They have the same past but not the same potential. You are dead: zero potential. The copy is alive: full potential. If both were alive, both would have potential. So you are basically eliminating the future of the original.

Let x be the function of experience and memory for an individual, with time as a parameter.
Let y be the corresponding function of experience for the copy after the copy-paste.
Let t be time.

Let's suppose the backup happens at t = 1,
just after the backup t = 10,
and at t = 11 you kill the original (without waking him up first, of course).

Obviously x(t) = y(t) for all t ≤ 10,

and x(t) ≠ y(t) for all t ≥ 11.

What is the difference between x(t) and y(t) for all t > 10?

The power to experience or to act has ceased for x. The function of experience is not the same.

Obviously, if you keep x alive and synchronized with y, then and only then will x(t) = y(t) for all t.
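The timeline above can be sketched as a quick script. This is illustrative only; the event strings and the cutoff values just mirror the t = 1, 10, 11 setup:

```python
# x is the original's experience stream, y is the copy's.
# Backup covers everything up to t = 10; the original is killed at t = 11.

def x(t):
    """Original's experience at time t."""
    if t <= 10:
        return f"shared-{t}"   # history recorded into the backup
    return None                # original has ceased: no experience at all

def y(t):
    """Copy's experience at time t."""
    if t <= 10:
        return f"shared-{t}"   # copy inherits the identical history
    return f"copy-{t}"         # copy's experiences continue on their own

# x(t) = y(t) for all t <= 10:
assert all(x(t) == y(t) for t in range(0, 11))
# x(t) != y(t) for all t >= 11:
assert all(x(t) != y(t) for t in range(11, 20))
```

The disagreement in the thread is then about whether the shared segment (t ≤ 10) is enough for y to count as a continuation of x, or whether the divergence at t = 11 is what matters.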

#14 Vgamer1

  • Topic Starter
  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 26 February 2009 - 09:29 PM

What 100YearsToGo says is a really good way of putting it, although I still don't think the "B" people will agree somehow.

Edited by Vgamer1, 26 February 2009 - 09:30 PM.


#15 valkyrie_ice

  • Guest
  • 837 posts
  • 142
  • Location:Monteagle, TN

Posted 27 February 2009 - 12:27 AM

What 100YearsToGo says is a really good way of putting it, although I still don't think the "B" people will agree somehow.


I understand his point quite well; however, from the viewpoint of ME, I will continue. As this is a matter of belief and opinion, since we do not yet have a complete enough understanding of the universe to conclusively prove which view may be true, or whether they are both equally true, I do not view either of our opinions as inimical to the other. I see the survival of my program as paramount, and I will certainly modify myself in whatever manner is needed in order to maximize the potential survival of my hardware. However, I cannot foresee all possibilities, and I know that accidents happen. My viewpoint allows me to accommodate the potential for the unexpected to occur, while still maintaining the survival of what I see as the "real" me.

As I stated to Vgamer1, it is the difference between "Game Over" and "Respawning", in my opinion. Both views are equally valid, in my opinion, and equally logical. All that is essentially different is our perspectives. As I see it, yours is from the viewpoint of you now, whereas I feel I am looking at it from the perspective of me following such a happening. I feel that the Me looking back at the non-functioning original will feel a sense of nostalgic loss, in the sense that I would feel for the loss of a long-used and comfortable coat or other item which I had long possessed, but I would not feel that "I" had died.

#16 100YearsToGo

  • Guest
  • 204 posts
  • 1
  • Location:Netherlands Antilles

Posted 27 February 2009 - 01:00 AM

What 100YearsToGo says is a really good way of putting it, although I still don't think the "B" people will agree somehow.


I understand his point quite well; however, from the viewpoint of ME, I will continue. As this is a matter of belief and opinion, since we do not yet have a complete enough understanding of the universe to conclusively prove which view may be true, or whether they are both equally true, I do not view either of our opinions as inimical to the other. I see the survival of my program as paramount, and I will certainly modify myself in whatever manner is needed in order to maximize the potential survival of my hardware. However, I cannot foresee all possibilities, and I know that accidents happen. My viewpoint allows me to accommodate the potential for the unexpected to occur, while still maintaining the survival of what I see as the "real" me.

As I stated to Vgamer1, it is the difference between "Game Over" and "Respawning", in my opinion. Both views are equally valid, in my opinion, and equally logical. All that is essentially different is our perspectives. As I see it, yours is from the viewpoint of you now, whereas I feel I am looking at it from the perspective of me following such a happening. I feel that the Me looking back at the non-functioning original will feel a sense of nostalgic loss, in the sense that I would feel for the loss of a long-used and comfortable coat or other item which I had long possessed, but I would not feel that "I" had died.



I don't think it is a matter of belief and opinion. Let's keep the software analogy.

My copy of Windows XP is not your copy of Windows XP. They are separate entities but the same program. They are not connected to each other in any way. They behave the same in every way but are still separate entities. When you destroy one copy, it does not live on through the other copy. The original copy is dead!
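The two-copies point can be illustrated in code, as a minimal sketch in which Python's equality-vs-identity distinction stands in for "same program" vs. "same entity":

```python
# Two bit-for-bit identical installs of the "same program":
original = {"program": "xp", "settings": [1, 2, 3]}
duplicate = {"program": "xp", "settings": [1, 2, 3]}

print(original == duplicate)   # True:  identical state, same program
print(original is duplicate)   # False: two separate entities

del original                   # "destroy" one copy

print(duplicate["program"])    # the survivor is unaffected: 'xp'
```

Equality of state (`==`) does not imply identity of entity (`is`), which is exactly the distinction this post is drawing.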

Edited by 100YearsToGo, 27 February 2009 - 01:05 AM.


#17 RighteousReason

  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 27 February 2009 - 01:02 AM

It is a bit difficult to describe the two philosophies I'm trying to differentiate here. I think the best way to describe them is with an example...

So let's say your consciousness is being recorded constantly onto a "backup" of sorts, so that at every instant the "backup" matches your exact consciousness. Now let's say you are killed somehow so that your original consciousness is completely destroyed. Now, your "backup" was not destroyed, and at the exact moment of your death your "backup" is booted up and starts running. Here's the conflict...

Philosophy A:

You are dead. You no longer hear, taste, smell, see, or feel anything. The "backup" of you is just a copy and nothing more. You will still be dead despite there being an exact copy of your consciousness in existence.

Philosophy B:

You are still alive. You still hear, taste, smell, see and feel anything that you normally would. You exist as your "backup" just as you did before. You will be alive as long as you still have a "backup" to boot up when you die.


I personally strongly believe in Philosophy A. It seems that people have strong opinions no matter which side they are on. I'm curious to see how many are on each side. I'll say more about what I think as the thread grows.

So which do you believe? Philosophy A or philosophy B?

(cross-posting)

To use Eliezer's thought exercise... imagine replacing one neuron at a time (or, if you are really picky, working at a lower scale) with a totally functionally equivalent non-biological component (doing so with the understanding that you replace all the atoms and molecules in your body constantly, so nothing fundamentally different or special is occurring). If you imagine doing it over a long period of time, you would continue thinking, remembering, and behaving as your usual conscious self after any given neuron is replaced. Use the process of induction here: once you have replaced all of the neurons, you are still exactly the same, yet totally digital. Now compress the time component so that everything happens in one instant. There you have it: you have uploaded from a biological to a computational substrate, without losing anything whatsoever.
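The replacement-by-induction argument can be sketched as a toy simulation. This is purely illustrative: the "neurons" are just functions, and the point is only that behavior is checked to be unchanged after every single swap:

```python
# Toy version of gradual substrate replacement with functional equivalence.

def bio_neuron(x):
    return max(0.0, x)        # some fixed transfer function

def digital_neuron(x):
    return max(0.0, x)        # functionally identical replacement

layer = [bio_neuron] * 5
inputs = [-1.0, 0.5, 2.0, -0.3, 1.0]

baseline = [f(v) for f, v in zip(layer, inputs)]

for i in range(len(layer)):               # replace one unit at a time
    layer[i] = digital_neuron
    output = [f(v) for f, v in zip(layer, inputs)]
    assert output == baseline             # behavior unchanged after each swap

print(all(f is digital_neuron for f in layer))  # True: fully "uploaded"
```

By induction, if no single swap changes the system's behavior, the fully replaced system behaves identically to the original; the philosophical dispute is over whether behavioral identity is the same thing as personal identity.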

So taking your argument to its logical conclusion, you would be saying that anytime we lose a particular atom or molecule we *totally die*, because to replace just one atom with a different one would be a "copy", and thus not technically 'you'. There is your absurdity, my friend.

Believe me- there is nothing supernatural about that lump of carbon between your ears (although I can certainly understand calling it magical or mystical in a purely romantic, non-literal sense).


"But I am not an object. I am not a noun, I am an adjective. I am the way matter behaves when it is organized in a John K Clark-ish way. At the present time only one chunk of matter in the universe behaves that way; someday that could change."
-- John K Clark

from Eliezer Yudkowsky at OvercomingBias.com


If I saved/uploaded a copy of my complete mental self, it wouldn't change a thing for me. I cannot stress this enough. My copy basically becomes a new person over time. It's just another person wandering the Earth, not unlike a twin: twins are nearly exact clones at birth, but become different people as they grow up. There is no continued life for me if I'm slammed by a bus and my copy continues. And frankly, that's all I care about -- my continued existence, not a copy's.


I, me, my

"But I am not an object. I am not a noun, I am an adjective. I am the way matter behaves when it is organized in a John K Clark-ish way. At the present time only one chunk of matter in the universe behaves that way; someday that could change."
-- John K Clark

Although I can't say I don't share your "yuck" reaction, this is a fundamental philosophical hurdle.

Suppose your full brain state was backed up upon each molecular operation. When you get hit by a bus and revived, how would that not be you?

See below about zombies before posting an objection, if you have one. Otherwise, do you agree?


As for the zombies, see this: http://www.overcomin...04/zombies.html

I think this is your particular fundamental philosophical hurdle.


Turns out this is a settled issue.


Edited by advancdaltruist, 27 February 2009 - 01:03 AM.


#18 100YearsToGo

  • Guest
  • 204 posts
  • 1
  • Location:Netherlands Antilles

Posted 27 February 2009 - 01:22 AM





Eliezer's thought exercise is not the same as making a backup copy (what vgamer1 envisioned). It is more like metamorphosis: the gradual changing of one creature into another. In this case the hardware changes, but consciousness remains the same. Even if it happens in one split second, it is still metamorphosis. In THAT case I would agree that you did not die, you just changed a bit.

#19 RighteousReason

  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 27 February 2009 - 01:36 AM

I can't do it. I just can't argue with stupid people anymore. Sorry stupid people. Good luck.

#20 100YearsToGo

  • Guest
  • 204 posts
  • 1
  • Location:Netherlands Antilles

Posted 27 February 2009 - 01:40 AM

I can't do it. I just can't argue with stupid people anymore. Sorry stupid people. Good luck.



Thats a good escape route. Bye

#21 Vgamer1

  • Topic Starter
  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 27 February 2009 - 03:17 AM

I can't do it. I just can't argue with stupid people anymore. Sorry stupid people. Good luck.


lol?

#22 Vgamer1

  • Topic Starter
  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 27 February 2009 - 03:21 AM

As an actual response to advancdaltruist, the problem with the thought experiment is the idea of replacing the neurons "in one instant." That is what would kill you. If it is done one at a time, or at some "slower" speed at which you remain conscious the entire time, then you would survive. If it's done all at once, that would mean you die and are replaced by a copy.

Just because there was a thread before about this doesn't mean the issue is settled...

Edited by Vgamer1, 27 February 2009 - 03:23 AM.



#23 valkyrie_ice

  • Guest
  • 837 posts
  • 142
  • Location:Monteagle, TN

Posted 27 February 2009 - 03:58 AM

As an actual response to advancdaltruist, the problem with the thought experiment is the idea of replacing the neurons "in one instant." That is what would kill you. If it is done one at a time, or at some "slower" speed at which you remain conscious the entire time, then you would survive. If it's done all at once, that would mean you die and are replaced by a copy.

Just because there was a thread before about this doesn't mean the issue is settled...



The fallacy I see with this argument is that our brains do not experience consciousness at all times. The brain functions, yes, but during sleep there are long periods between REM states where our consciousness is dormant. There are also occasions where my consciousness has been deliberately interrupted by anesthesia. I was not conscious when my appendix was removed, and my brain retains no memories of the experience. Nor would I wish to retain the experience of having my body cut open, my nerves screaming in pain, while an internal organ was cut out of my body and the massive infection drained.

As consciousness is able to continue over such gaps, I cannot see a fundamental difference between replacing my neurons one at a time while I am completely conscious and an instantaneous replacement, just as I cannot see a difference between two copies of me exchanging memories at intervals and two copies exchanging memories instantaneously via some sort of constant linkage. The me who wakes up every day is the me who went to sleep the night before, just as the me who woke up from surgery was the same me who went under anesthesia. Again and again, it seems to me the argument comes down to an inherent "specialness" being assigned to the hardware.

To make another computer analogy, for the sake of clarifying concepts, let's say you had a computer. This computer is the generic type, with all the basic functions. You turn it on, it works, and as time passes you fill up the hard drive with all your files, customize the OS, and make it suit your needs. Now you replace the video card, using a new video card that is in all respects identical to the original. Then the network card, then one part at a time you replace every single part of this computer, until the only piece of equipment in it that is the same as the computer you originally bought is the hard drive with all your personal data, and every single piece is identical to the part it replaced (same manufacturer, same model, etc.). Then you decide to replace the drive, so you acquire a new drive, hook it up, and manually copy every single file from the first drive onto the second, byte by byte, so that the new drive has all your data, all your customizations, and every single thing you had put on the old drive. When you are done, the computer is in every respect the exact same computer it was, with entirely new parts, but nothing else about it has changed: its functions are exactly the same, the specs are unchanged, and all the data is identical. Just for form's sake, let's say you even put a scratch on the case to match the one that existed on the previous one. How, in any real sense, would someone who came to your house and looked at the computer tell that the computer sitting on your desk is any different from the one that sat there all the previous times they came?

And how, in essence, is this any different from setting a new identical computer next to your old one and simply copying the hard drive over before disposing of the previous one?
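The byte-by-byte copy step can be put in a few lines of purely illustrative Python (the "drives" here are just byte buffers standing in for the analogy): the copy ends up equal in content to the original yet remains a distinct object, which is exactly the distinction Philosophy A and Philosophy B fight over.

```python
# Byte-by-byte copy vs. the original: equal in content, distinct in identity.
# (Illustrative sketch only; the "drives" are stand-ins for the analogy.)

old_drive = bytearray(b"all my files, settings and customizations")
new_drive = bytearray(len(old_drive))

# Manually copy every single byte from the old drive to the new one.
for i, byte in enumerate(old_drive):
    new_drive[i] = byte

assert new_drive == old_drive       # indistinguishable by content (==)
assert new_drive is not old_drive   # still two separate objects (is)
```

In Python terms, the whole disagreement is whether what matters about you is `==` (equality of content) or `is` (identity of the particular object).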

#24 100YearsToGo

  • Guest
  • 204 posts
  • 1
  • Location:Netherlands Antilles

Posted 27 February 2009 - 02:51 PM



C'mon valkyrie... no one would know the difference between the two computers. Only the old computer would know the difference, when you send it to the scrap yard. I don't want to be that old computer! Ha ha.

I'm sure you realize that two molecules of the same identical chemical makeup are not the same entity? I understand that saving the code is important. But if that's the only thing you care about, why not cut out some cells, treat them with trehalose, freeze them, and conserve your DNA? You would conserve all of your code. After 100 years, when the economy gets better, they would reconstitute you. Of course you would not have your past memory... but who cares about memory anyway! Your brain has limited storage capacity, so you must forget some things anyway if you are immortal. And I'm sure you wouldn't care to remember something from about a million years ago.

BTW, these guys talking about uploading? Will they upload your limbic system too? The one that is in charge of falling in love, or getting an erection, or making your heart beat, or enjoying a beautiful morning, or fighting for our survival when threatened? Or will they cut off those inputs? If you are all digital, those chemical inputs will simply not be there. Will we still be us? How would you fall in love with other bits and bytes? What about fatherly or motherly love? Mostly when we talk about uploading and such, we think of memory and intelligence... There are other things worth preserving, in my opinion.

#25 Vgamer1

  • Topic Starter
  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 27 February 2009 - 10:15 PM

I'm curious 100Years... Do you believe uploading is possible? I'm skeptical, but I think maybe in some sense it would be. For example, if we slowly replaced our cells and neurons with synthetic ones over time so that we are essentially "running" on new hardware, would that qualify as uploading? Maybe not in a technical sense, but I think that would be good enough for me. I'd be happy to be in a body that was pretty similar to mine, except the components are much stronger and will last much longer. Also, I'd be able to stay in a "physical" form instead of just existing on a hard drive somewhere floating in cyberspace.

The only way I would see true uploading as a possibility would be to copy your memories and such onto a computer program, then somehow linking the two systems through a network of some kind so that you would be conscious of both entities, and then destroying the original. But even that sounds dangerous if not done properly. I still think I'd prefer staying in a "physical" form, like a cyborg basically.

What do you think?

Edited by Vgamer1, 27 February 2009 - 10:16 PM.


#26 100YearsToGo

  • Guest
  • 204 posts
  • 1
  • Location:Netherlands Antilles

Posted 28 February 2009 - 12:15 AM

I'm curious 100Years... Do you believe uploading is possible? I'm skeptical, but I think maybe in some sense it would be. For example, if we slowly replaced our cells and neurons with synthetic ones over time so that we are essentially "running" on new hardware, would that qualify as uploading? Maybe not in a technical sense, but I think that would be good enough for me. I'd be happy to be in a body that was pretty similar to mine, except the components are much stronger and will last much longer. Also, I'd be able to stay in a "physical" form instead of just existing on a hard drive somewhere floating in cyberspace.

The only way I would see true uploading as a possibility would be to copy your memories and such onto a computer program, then somehow linking the two systems through a network of some kind so that you would be conscious of both entities, and then destroying the original. But even that sounds dangerous if not done properly. I still think I'd prefer staying in a "physical" form, like a cyborg basically.

What do you think?


I don't know. We are far from having a complete understanding of how the brain functions and how consciousness arises. There are researchers, like Penrose and Evan Harris Walker, connecting consciousness with quantum mechanics. If the mind functions according to classical mechanics, it is uploadable. If not, I doubt it can be uploaded, because no system outside of your brain can duplicate the quantum fluctuations that exist in your brain. Your quantum fluctuations are you! Even if you could make copies of your quantum fluctuations (a thing that seems impossible to me), it would not be you.

http://en.wikipedia....iki/Henry_Stapp

#27 Vgamer1

  • Topic Starter
  • Guest, F@H
  • 763 posts
  • 39
  • Location:Los Angeles

Posted 28 February 2009 - 12:39 AM

I don't know. We are far from having a complete understanding of how the brain functions and how consciousness arises. There are researchers, like Penrose and Evan Harris Walker, connecting consciousness with quantum mechanics. If the mind functions according to classical mechanics, it is uploadable. If not, I doubt it can be uploaded, because no system outside of your brain can duplicate the quantum fluctuations that exist in your brain. Your quantum fluctuations are you! Even if you could make copies of your quantum fluctuations (a thing that seems impossible to me), it would not be you.

http://en.wikipedia....iki/Henry_Stapp


But what about replacing neurons one by one with synthetic ones? Surely you would remain conscious during the process and then your consciousness would exist on a new substrate. That sounds possible to me.

#28 100YearsToGo

  • Guest
  • 204 posts
  • 1
  • Location:Netherlands Antilles

Posted 28 February 2009 - 01:54 AM

I don't know. We are far from having a complete understanding of how the brain functions and how consciousness arises. There are researchers, like Penrose and Evan Harris Walker, connecting consciousness with quantum mechanics. If the mind functions according to classical mechanics, it is uploadable. If not, I doubt it can be uploaded, because no system outside of your brain can duplicate the quantum fluctuations that exist in your brain. Your quantum fluctuations are you! Even if you could make copies of your quantum fluctuations (a thing that seems impossible to me), it would not be you.

http://en.wikipedia....iki/Henry_Stapp


But what about replacing neurons one by one with synthetic ones? Surely you would remain conscious during the process and then your consciousness would exist on a new substrate. That sounds possible to me.


If the brain functions according to classical mechanics, yes, you can do it.

However, if consciousness is a quantum effect, its state cannot be determined. No algorithm can determine quantum wave function collapse, for instance. It follows that consciousness would be non-algorithmic. There would be no mechanical system able to duplicate it, no equivalent logical program. You would not be able to replace it bit by bit like parts in the engine of a car. The mere act of looking at a bunch of quantum fluctuations would collapse them. How would you build even one artificial neuron to replace a certain quantum state that you cannot even measure?

#29 Infernity

  • Guest
  • 3,322 posts
  • 11
  • Location:Israel (originally from Amsterdam, Holland)

Posted 28 February 2009 - 08:41 AM

A.

After YOU die, it doesn't matter how many copies you had or how similar they are: YOU are gone, you no longer "are", and nothing is relevant then. Your consciousness is GONE. The copy, however similar, might think and be the same, but it's not It.

#30 valkyrie_ice

  • Guest
  • 837 posts
  • 142
  • Location:Monteagle, TN

Posted 28 February 2009 - 09:20 AM

*Sigh* I see this is coming down to a battle of belief systems rather than a discussion, so I will agree to disagree for now. I have stated my opinions and the logic behind them. At present we lack sufficient knowledge to make more than educated guesses as to the possibilities, so further argument is unlikely to achieve significant results. Only time will tell.

View that as you will.



