  LongeCity
              Advocacy & Research for Unlimited Lifespans





'Living forever as a machine...'


52 replies to this topic

#1 AdamSummerfield

  • Guest
  • 351 posts
  • 4
  • Location:Derbyshire, England

Posted 20 October 2007 - 04:36 PM


There is something I don't understand; however, I don't expect it to be difficult for someone here to explain.

I have heard many claims that one day we shall download our personalities into computers and 'live forever as machines' - or as it is said in 'Essays on Infinite Lifespans' : "... it should be possible to transfer a human personality into a robot, thereby extending the person's lifetime by the durability of the machine."

I see a problem with this strategy. If you download everything about your brain into a machine, then die - you're dead. You are pronounced medically dead due to the lack of neural waves present in your brain. Once the machine is switched on - you're still dead. The machine just insists that it is you, it has all your memories and so on.

The purpose in some cases was to argue that we can achieve a form of immortality in that something with our memories and personality - that inextricably believes that it is the person that died - still has consciousness and is still 'alive'. But in other cases the point was undoubtedly to argue that by downloading our minds on to computers that we can live beyond the grave as computers.

What is it that I have missed in this concept?

- Sezarus

#2 Athan

  • Guest
  • 156 posts
  • 0

Posted 20 October 2007 - 06:12 PM

I have the exact same issue with this speculative technique.

It is my memories, my personality, my pattern - but I still know it is not me. What's the problem with this reasoning?


#3 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 20 October 2007 - 06:17 PM

You'll get mixed answers from this forum. There are plenty here who have no problem with this scenario, and plenty who do.

Personally, I'm with you. If I upload a copy of myself into a computer, it may as well be a completely different person. Regardless of whether that soft-copy of me thinks and acts as though it feels just like me, it may as well be a copy of you. It ain't me. So when I pull the plug on the original hard-copy of me, that's it! I'm dead! But good luck to my soft-copy, I wish it the best in...simlife.

But seriously, there are very good arguments on the other side for why it doesn't matter, why you shouldn't be worried. Brian Wowk was one of the best at articulating them, but I haven't been around much in the last year, so I don't know whether he's around much either. I don't agree with his position, but I understand it reasonably well and I can't entirely counter it.

#4 JonesGuy

  • Guest
  • 1,183 posts
  • 8

Posted 20 October 2007 - 06:18 PM

It depends on how the personality is transferred.

Remember that 'you' are not just a portion of your brain. If you have a stroke in some part of your brain, that region will die. But 'you' will still be alive. Heck, portions of our brain degrade every day - yet we're still us.

So what might happen is that one's consciousness would spread and expand into the silicon. The whole time, the gestalt would be 'you'. If a portion died, you'd still be you - still

#5 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 20 October 2007 - 06:24 PM

A similar and thornier question is what happens if I make a new hard-copy of myself. In other words, a physically identical copy of my body, brain and all. What then?

Here, the materialists are even harder to refute. Short of some quantum mechanical or other bizarre process or physical quality, it's hard to refute the claim that your consciousness would continue uninterrupted in the copy, even if the copy were "instantiated" at a later time, perhaps even after the death of the original you.

Here, we leave the realm of physics (and don't let a materialist tell you that you're cheating) and enter the realm of metaphysics and philosophy. Not that physics doesn't have its uses, but ultimately, physics doesn't describe mathematics, and physics doesn't dictate metaphysics either. At best it constrains it. But I reserve the right to be wrong. ;)

#6 AdamSummerfield

  • Topic Starter
  • Guest
  • 351 posts
  • 4
  • Location:Derbyshire, England

Posted 20 October 2007 - 06:53 PM

If I began replacing my brain with non-biological circuitry, at first I would certainly remain, although simultaneously reaping the benefits of augmented consciousness. I am wondering if once I had finished replacing my brain, bit by bit, with 'computer parts' whether or not I would remain the same.

#7 John Schloendorn

  • Guest, Advisor, Guardian
  • 2,542 posts
  • 157
  • Location:Mountain View, CA

Posted 20 October 2007 - 07:00 PM

What is it that I have missed in this concept?

That this has already happened -- your body is the machine!

#8 Futurist1000

  • Guest
  • 438 posts
  • 1
  • Location:U.S.A.

Posted 20 October 2007 - 07:11 PM

If you replace your real neurons with "artificial neurons" (slowly, maybe doing it one neuron at a time through nanotechnology), your brain could be totally replaced while your stream of consciousness was maintained. I think that's the standard line of reasoning given. Even if it's transferred to a computer, there may be a way of doing it gradually so that your consciousness never ceases subjectively.

#9 basho

  • Guest
  • 774 posts
  • 1
  • Location:oʎʞoʇ

Posted 20 October 2007 - 10:10 PM

If I began replacing my brain with non-biological circuitry, at first I would certainly remain, although simultaneously reaping the benefits of augmented consciousness. I am wondering if once I had finished replacing my brain, bit by bit, with 'computer parts' whether or not I would remain the same.

It's an old and interesting philosophical question. Given that the cells and atoms that make up our bodies are continually being replaced, this question applies to all of us, even if you don't replace biological neurons with non-biological circuitry. Continuity of personal identity is maintained despite the fact that the physical composition of our body is constantly in flux.

#10 electric buddha

  • Guest
  • 76 posts
  • 0
  • Location:Helena,MT

Posted 20 October 2007 - 10:22 PM

What is it that I have missed in this concept?


The illusion of a Cartesian theater, soul, or homunculus sitting in the brain, looking out into the world through your eyes. It's a difficult concept to get over because it seems so intuitively set, but once you get into a mindset where there can be instances that are x% of you, the concept of such a transfer becomes more sound.

#11 Matt

  • Guest
  • 2,862 posts
  • 149
  • Location:United Kingdom

Posted 20 October 2007 - 10:29 PM

I've been thinking about this and I am pretty sure I know how it will work ;)

I don't think it will be the kind of upload to a machine that some people may be thinking about. My take on all this is a slow transition from biological to artificial over time, by 'replacing' and/or augmenting the brain. In this scenario our consciousness would never cease, and eventually our personality and our old and new memories would be transferred or built up on non-biological formats, and we'd continue to live - initially both biological and artificial, eventually totally non-biological. So as new memories are formed, your personality and 'you' are being recorded onto these computer chips.

I don't think it would be: upload to a computer, and then if I die it doesn't matter because I continue on that computer. We all know this is not the case; the copy will just be a copy. It might not matter to others, as I - or rather the copy - will basically seem like me to my friends, but I'll be dead eventually. Transferring consciousness by uploading to another format? No, I don't think so. I think it will be a slow transition over time by augmentation of our brain.

#12 william7

  • Guest
  • 1,777 posts
  • 17
  • Location:US

Posted 14 November 2007 - 02:55 AM

I have heard many claims that one day we shall download our personalities into computers and 'live forever as machines'

This might be the ultimate insult to humanity - turning man into a machine. Check out what Erich Fromm says in his book The Revolution of Hope: Toward a Humanized Technology (1968):

Some anthropologists and other observers of man have believed that man is infinitely malleable. At first glance, this seems to be so. Just as he can eat meat or vegetables or both, he can live as a slave and as a free man, in scarcity or abundance, in a society which values love and one which values destruction. Indeed, man can do almost anything, or, perhaps better, the social order can do almost anything to man. The "almost" is important. Even if the social order can do everything to man — starve him, torture him, imprison him, or overfeed him — this cannot be done without certain consequences which follow from the very conditions of human existence. Man, if utterly deprived of all stimuli and pleasure, will be incapable of performing work, certainly any skilled work. If he is not that utterly destitute, he will tend to rebel if you make him a slave; he will tend to be violent if life is too boring; he will tend to lose all creativity if you make him into a machine.

p.61

#13 Live Forever

  • Guest Recorder
  • 7,475 posts
  • 9
  • Location:Atlanta, GA USA

Posted 14 November 2007 - 03:06 AM

This might be the ultimate insult to humanity - turning man into a machine.

We are already a machine, just a biological one as opposed to a mechanical one.

#14 william7

  • Guest
  • 1,777 posts
  • 17
  • Location:US

Posted 14 November 2007 - 03:17 AM

This might be the ultimate insult to humanity - turning man into a machine.

We are already a machine, just a biological one as opposed to a mechanical one.

There's a big difference, however, between a biological machine and a man-made mechanical one. Fairly significant, wouldn't you agree?

Here's another Erich Fromm quote from the same book. I know you're going to love it.

A specter is stalking in our midst whom only a few see with clarity. It is not the old ghost of communism or fascism. It is a new specter: a completely mechanized society, devoted to maximal material output and consumption, directed by computers; and in this social process, man himself is being transformed into a part of the total machine, well fed and entertained, yet passive, unalive, and with little feeling.

Makes you think about body slamming your computer doesn't it?

#15 niner

  • Guest
  • 16,276 posts
  • 1,999
  • Location:Philadelphia

Posted 14 November 2007 - 03:19 AM

Sezarus, I'm glad you asked that question, because the answers here have been pretty good. I never really gave this much thought, but I can now imagine a situation where, given an appropriate neural interface and a nice new computer to live in, your mind would kind of have a new "room" to inhabit. Experientially, it might even feel like you can shift back and forth, although my money would be on a more unified experience. At any rate, yeah, I could see a situation where, after a sufficient amount of time for knowledge and memory transfer, you could essentially unplug the wetware part of yourself and never miss it. This week's New Scientist had a review of a book about sex robots. Even though the reviewer panned the concept, I can see that if we manage to stick around for about a hundred years, things are going to get really interesting. And weird. (Which tends to be interesting in itself.)

#16 Live Forever

  • Guest Recorder
  • 7,475 posts
  • 9
  • Location:Atlanta, GA USA

Posted 14 November 2007 - 03:36 AM

This might be the ultimate insult to humanity - turning man into a machine.

We are already a machine, just a biological one as opposed to a mechanical one.

There's a big difference, however, between a biological machine and a man made mechanical one. Fairly significant wouldn't you agree?

There is a difference, but it isn't a big one. Other than being far more complex (something that will reverse itself at some point in the future), biological machines are not that different from mechanical ones. If a mechanical machine could do what the biological one does, but better, then I certainly don't see a problem with making the change. Do you fault people for having pacemakers or mechanical limbs (arms and legs), or any of the hundreds of machines that already interact directly with our physiology? I see them as life savers and things that make life better for people. If that could be improved, then I am all for it.


Makes you think about body slamming your computer doesn't it?

Certainly not. Computers do not have prejudices. Computers are not racist or sexist. Computers do not want to push their religious or moral ideologies on someone else. They do what they are designed to do more efficiently than humans could ever do. Period. I hope that they can take over larger sectors of things that are currently being done by humans in the future.

#17 Shannon Vyff

  • Life Member, Director Lead Moderator
  • 3,897 posts
  • 702
  • Location:Boston, MA

Posted 14 November 2007 - 04:14 AM

I'm on the side that feels their body already is a machine, and I would not mind an upgraded, less fragile, more efficient machine.

#18 Grail

  • Guest, F@H
  • 252 posts
  • 12
  • Location:Australia

Posted 14 November 2007 - 04:36 AM

This brings up the interesting "what makes us human and is this something worth holding onto?" question.
Personally, I can see positives and negatives to becoming "less fragile and more efficient".
This song by Scott Matthew, written for the Ghost in the Shell: Stand Alone Complex anime, evokes some interesting emotions when thinking about what we may lose without realising it.

"I analyze and I verify and I quantify enough
100 percentile no errors no miss
I synchronize and I specialize and I classify so much
Don't worry 'bout dreaming because I don't sleep --

I wish I could at least 30 percent
Maybe 50 for pleasure then skip all the rest

If I only was more human
I would count every single second the rest of my life
If I just could be more human
I'd have so many little babies and maybe a wife

I'd roll around in mud and have lots of fun then when I was done
Build bubblebath towers and swim in the tub
Sand Castles on the beach, frolick in the sea, get a broken knee
Be scared of the dark and I'd sing out of key

Curse when I lost a fight, kiss and reunite, scratch a spider's bite
Be happy with wrinkles I got when I smile
Pet kittens 'till they purred, maybe keep a bird, always keep my word
I'd cry at sad movies and laugh 'till it hurt

I'd buy a big bike, I'd ride by the lake
And I'd have lots of friends and I'd stay out too late

If I could just be more human
I would see every little thing with a gleam in my eye
If only I was more human
I'd embrace every single feeling that came in my life

Would I care and be forgiving?
Would I be sentimental and would I feel loneliness?

Would I doubt and have misgivings?
Would I cause someone sorrow too? Would I know what to do?

Will I cry when it's all over?

When I die will I see Heaven?"


Of course, as an immortalist I don't agree with everything in it...but it makes you think just the same.

The original Ghost in the Shell movie and the sequel explore this theme to some extent, and are highly regarded.

As for the topic of the thread...I think that Matt has the right idea. This has all been said before though.

#19 niner

  • Guest
  • 16,276 posts
  • 1,999
  • Location:Philadelphia

Posted 14 November 2007 - 04:47 AM

Nice song, but you could do all that human stuff with, say, a prosthesis. Living in a machine would just be more prostheses, but it doesn't mean that you wouldn't feel, or have emotion. People who are sick, decrepit, and suffering probably don't see every little thing with a gleam in their eye, but maybe they would again with an artificial body.

#20 william7

  • Guest
  • 1,777 posts
  • 17
  • Location:US

Posted 14 November 2007 - 11:36 AM

I hope that they can take over larger sectors of things that are currently being done by humans in the future.

I just hope they don't become a cancerlike growth as Erich Fromm mentions in this quote below:

But in contrast to those who, like the previously mentioned authors, recognize the specter with either sympathy or horror are the majority of men, those at the top of the establishment and the average citizen, who do not see a specter. They have the old-fashioned belief of the nineteenth century that the machine will help lighten man's burden, that it will remain a means to an end, and they do not see the danger that if technology is permitted to follow its own logic, it will become a cancerlike growth, eventually threatening the structured system of individual and social life.



#21 william7

  • Guest
  • 1,777 posts
  • 17
  • Location:US

Posted 14 November 2007 - 11:38 AM

Personally, I can see positives and negatives to become "less fragile and more efficient".

Me too. I think we better be very careful about turning ourselves into machines.

#22 Live Forever

  • Guest Recorder
  • 7,475 posts
  • 9
  • Location:Atlanta, GA USA

Posted 14 November 2007 - 05:33 PM

I hope that they can take over larger sectors of things that are currently being done by humans in the future.

I just hope they don't become a cancerlike growth as Erich Fromm mentions in this quote below:

But in contrast to those who, like the previously mentioned authors, recognize the specter with either sympathy or horror are the majority of men, those at the top of the establishment and the average citizen, who do not see a specter. They have the old-fashioned belief of the nineteenth century that the machine will help lighten man's burden, that it will remain a means to an end, and they do not see the danger that if technology is permitted to follow its own logic, it will become a cancerlike growth, eventually threatening the structured system of individual and social life.

"Cancerlike," eh? Seems like a loaded term to me. I do not know which properties of cancer he wishes it not to have. Certainly the only way that computers will be adopted to do more things is if they are more efficient than humans (unlike cancer cells). The growth of computing has been quite fast, but not nearly as fast as cancer (although I suppose you could draw a parallel with differing timescales and comparable rates; I haven't done the math). Certainly, computers will not eventually kill society, but instead make improvements to society, which is much different from cancer (which kills the organism).

No, I think "cancerlike" is a very poor choice of a word to use in this instance.

#23 Live Forever

  • Guest Recorder
  • 7,475 posts
  • 9
  • Location:Atlanta, GA USA

Posted 14 November 2007 - 05:34 PM

Personally, I can see positives and negatives to become "less fragile and more efficient".

Me too. I think we better be very careful about turning ourselves into machines.

I don't think that anyone (even the proponents of improvements) would say we should not be careful about it. Being cautious and making sure everything is safe every step of the way is paramount. (to me anyway)

#24 forever freedom

  • Guest
  • 2,362 posts
  • 67

Posted 14 November 2007 - 08:57 PM

Considering that we are constantly becoming "copies" of ourselves (our atoms and cells are constantly being replaced), I don't see much problem in turning ourselves into machines as long as the process is done safely.

My guess is that by going step by step and replacing small biological parts of us with non-biological parts, we will still be "us" in the end.

#25 RighteousReason

  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 14 November 2007 - 10:44 PM

I see a problem with this strategy. If you download everything about your brain into a machine, then die - you're dead. You are pronounced medically dead due to the lack of neural waves present in your brain. Once the machine is switched on - you're still dead. The machine just insists that it is you, it has all your memories and so on.


If you want to look at it that way, then you die every night when you go to sleep.

#26 RighteousReason

  • Guest
  • 2,491 posts
  • -103
  • Location:Atlanta, GA

Posted 14 November 2007 - 10:45 PM

Permutation City by Greg Egan

#27 Grail

  • Guest, F@H
  • 252 posts
  • 12
  • Location:Australia

Posted 15 November 2007 - 12:16 AM

I have nothing against the extensive use of computers in our society, I just hope we don't become like them. Emotions for example are extremely inefficient. What are we without emotions? Seems to me that we have to actively be aware of retaining our humanity.

As for dying when you go to sleep... your brain is constantly working while you sleep, so I don't think that is a problem. If all neural activity ceases, then you are dead. Even with cryonics, the aim is that neural activity is simply slowed down to prevent decay, not halted altogether.

#28 Live Forever

  • Guest Recorder
  • 7,475 posts
  • 9
  • Location:Atlanta, GA USA

Posted 15 November 2007 - 12:56 AM

I have nothing against the extensive use of computers in our society, I just hope we don't become like them. Emotions for example are extremely inefficient. What are we without emotions? Seems to me that we have to actively be aware of retaining our humanity.

Who says you would lose your emotions?

#29 william7

  • Guest
  • 1,777 posts
  • 17
  • Location:US

Posted 15 November 2007 - 01:34 AM

I have nothing against the extensive use of computers in our society

As long as they're used to support a peaceful, communal society devoted to a long, healthy life of serving God, I believe computers could be very useful. I'm opposed to the use of computers, however, when they're used for destructive purposes, as they're now being used. Check out Erich Fromm's excellent description of the problem below:

How did it happen? How did man, at the very height of his victory over nature, become the prisoner of his own creation and in serious danger of destroying himself?
In the search for scientific truth, man came across knowledge that he could use for the domination of nature. He had tremendous success. But in the one-sided emphasis on technique and material consumption, man lost touch with himself, with life. Having lost religious faith and the humanistic values bound up with it, he concentrated on technical and material values and lost the capacity for deep emotional experiences, for the joy and sadness that accompany them. The machine he built became so powerful that it developed its own program, which now determines man's own thinking.
At the moment, one of the gravest symptoms of our system is the fact that our economy rests upon arms production (plus maintenance of the whole defense establishment) and on the principle of maximal consumption. We have a well-functioning economic system under the condition that we are producing goods which threaten us with physical destruction, that we transform the individual into a total passive consumer and thus deaden him, and that we have created a bureaucracy which makes the individual feel impotent.
Are we confronted with a tragic, insolvable dilemma? Must we produce sick people in order to have a healthy economy, or can we use our material resources, our inventions, our computers to serve the ends of man? Must individuals be passive and dependent in order to have strong and well-functioning organizations?




#30 Grail

  • Guest, F@H
  • 252 posts
  • 12
  • Location:Australia

Posted 15 November 2007 - 03:33 AM

I have nothing against the extensive use of computers in our society, I just hope we don't become like them. Emotions for example are extremely inefficient. What are we without emotions? Seems to me that we have to actively be aware of retaining our humanity.

Who says you would lose your emotions?


I think that if the search for efficiency were taken to its conclusion, the loss of emotions would be inevitable. If they were not lost altogether, they would probably be suppressed by the higher thought processes of your mechanically improved brain as they would be perceived as glitches, and barriers to efficient operation of the system as a whole. It is not inevitable that we would reach this conclusion, but all I am saying is that we need to be vigilant on a number of fronts, and perhaps make compromises as far as efficiency, logic and rationality are concerned, in order to maintain our humanity.



