At last you told about the possible technology mentioned above, and I think it won't be achieved unless we gain vast knowledge about the structure of the brain; the same technological progress would have to be made for the procedure suggested by me ....

What is likely soon, though, are brain-enhancing technologies and the gradual replacement of diseased or dysfunctional tissues with neural prosthetics such as artificial sensory input.

Brain Copy and Paste ....
#91
Posted 16 March 2005 - 10:49 AM
#92
Posted 22 March 2005 - 04:42 PM
Laz,
The reason I emphasise inanimate objects in discussions like these is for the very reason that they cannot support consciousness. Essentially, I am saying, "If the answer you propose doesn't work for a brick, it doesn't work for you." It is often difficult to look at things like consciousness objectively; however, we have a natural intuition about real objects. And actually, the paper model is very good for grounding this discussion: to print all this out on a piece of paper is exactly the same as moving someone as bits through a wire. In both cases they are 'existing' in an inanimate object incapable of making them conscious. If it cannot support your mind, then it cannot reasonably support you being you. The process "you -> not you -> you" doesn't follow logically.
The process you -> not you -> you happens every night when you go to bed, though, doesn't it?
Earlier you mentioned that I was saying that copying the hard drive to another computer does not make that computer the *same* computer. This is true. The physical hardware, the substrate that the computer runs on, is not the same. I don't think this analogy really applies, though. To me, the mind isn't the 'processor' or the hardware computer itself, but the 'desktop'. Here is a thought experiment that can demonstrate the truthfulness of this situation.
Proposition A: Say you have your mind recorded in that paper printout of the current state of your brain, as you mentioned before. Immediately following the recording, you suffer a traumatic brain injury in which a large chunk of your brain is damaged. Using the recording as a reference, they repair the damaged section of your brain, returning it to its healthy state. When you wake up from your accident, you have some of the memories from after the accident but all of them from before. From every angle, you are you, just repaired.
Proposition B: Now, one step further. Say you are dealt total amnesia in another incident. Using the paper printout of you from the scan, you are once again repaired. You wouldn't remember anything after the accident, but your mind would be in the state of waking up right after your scan.
Proposition C: The same event as proposition A, except instead of repairing the damage, they replace the damaged section of the brain with an electronic version that mimics the inputs and outputs of your recorded section of the brain perfectly, and adapts and changes just like biological tissue would. You are certainly still you, as the essential responses to each electrochemical signal of the brain that make up your every action are still there, responding as they normally would.
Proposition D: The same as proposition A, except instead of repairing your damaged tissue or replacing it with an electronic brain-aiding implant, they clone your body, age the brain to the proper size, then reposition the mass of the healthy brain into the proper molecular configuration to fit into the damaged region of your brain. In this event, the brain material must have been too damaged, or removed, to repair. The repaired section of your brain functions identically to the previous section of the brain; the rest of the brain communicates with it and receives the same signals it would have pre-damage. You are still you when you wake up.
Proposition E: Proposition D, but one step further. Immediately following the scan, while performing routine maintenance on your home-grown large hadron collider, you are inadvertently disintegrated in a cloud of relativistically speeding protons. A clone of you is grown and brought to your age; then, using the paper printout, the brain is 'repaired' to match the state of you in the scan. This procedure is not intrinsically different from proposition D, except that none of the original brain material is left. As with all of the other instances, it seems the relevant definition of you is the patterns that 'answer back' from each part of your brain as it is queried, not the actual meat substance of the brain. You are, mind-bogglingly, still you.
The primary logical reinforcement of this idea is the shutdown of the brain, whether induced by barbiturates, hypothermia, or merely deep sleep. An alcoholic blackout is the total destruction of your conscious self from the night before, but you remain alive when you wake in the morning. (At least we seem to think so...) After a surgery where your brain is totally shut down with barbiturates to a flat-line EEG, you retain no long-term memory (like an alcoholic blackout) but are apparently still alive in the morning. The important distinction to face is that the human mind is not the organs of the brain, the different sections of our neural pathways, or the electrochemicals travelling down their sodium-potassium cycles. It's the responses, the data transmitted back and forth, the *computation* done by the brain that results in us. Think about it: if the brain shuts down entirely during a surgery like that, then there is no way consciousness could avoid being fooled by replacing any one part of the brain with a perfect mimicker of that section. When the brain woke up, it would send signals to that part, the part would act exactly the same, and the mind would never know the difference. To me, the fact that the brain can be turned off and then turned back on shows that it's not essential what it is running on, only that the computations get done. The signals will match and living will continue. I desperately hope that I am correct, because if you are, then we may be looking at death every time we turn off our consciousnesses. One of the following must be true:
1) We 'kill ourselves' every single night. The person that is 'resumed' in the morning is just like a computer that has been turned off in hibernate mode: the data is recorded and put back together in the morning, but it is not the same consciousness from last night. It is a new birth that will live until we turn it off again and start over. In this case, immortality can only be had by staying active (one stream of consciousness) forever.
2) It's not important where the data is, just that it is computed. A person can be resumed anywhere. One could set the computer down in hibernate mode, record the contents of the RAM and the hard disk onto paper, move to another computer, feed in all those values, and then start the computer up, and it would come out of hibernate mode right where you left your desktop. This is a more appropriate analogy, I think, than one computer not being the other. Our minds are not the computers, but the program running on them.
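The hibernate analogy can be sketched loosely in Python, treating a "mind" as nothing but serializable state. This is only an illustration of the analogy in the post, not a claim about brains; all the names and values here are made up.

```python
import pickle

# A toy "mind": the full state of a running process, represented as data.
state = {"memories": ["scan day"], "attention": 0.7, "step": 1024}

# "Hibernate": serialize the state to inert bytes (the paper printout).
frozen = pickle.dumps(state)

# "Move to another computer": restore the bytes into a fresh object.
resumed = pickle.loads(frozen)

# The resumed state is indistinguishable from the original...
print(resumed == state)   # True
# ...even though it lives in a different object at a different memory address.
print(resumed is state)   # False
```

On this view, the equality check is what matters and the identity check is irrelevant, which is exactly position #2 above.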
3) The third (and, I think, most likely) hypothesis is that there is no consciousness at all. It's just an illusion, a useful survival mechanism that makes us think we are watching a movie in front of our eyes. In this case, worries about killing ourselves by uploading are unfounded; we are constantly 'dying' anyway.
To say anything else is to believe in a soul. The meat is only relevant insofar as it performs the computations necessary to have a mind. To claim that the brain requires at least 50% of the original brain material in order to stay the same mind (something a lot of people have said to me, in claiming that neural replacement implants could only go so far to help people until they 'die') is dualism: claiming there is something more to our minds than computation, some intrinsic factor that can't be captured by science. Personally, I hope #2 is right: that gives us the most portability and allows us to enjoy the lives we hold so dear. Even if #3 is right, at least the illusion is consistent enough to allow us the illusion of enjoying a continuous life. That right there is good enough evidence that uploading won't do any harm.
Even if you believe the upload to be merely a copy, a new person based on the old one, that raises another interesting question. The reason I feel this is a very important argument is that I personally don't want to see someone killing themselves because they believe their copy to be them. In the same way, I don't want someone to perform a quantum suicide experiment to test whether they are immortal by the MWI. In both cases, they'd stop existing in "my" universe. To me, you cannot upload yourself by slicing up your brain and reading it into a machine, and anyone attempting to do so is, misguidedly, killing themselves.
What about 'copying' from an upload's perspective? If an upload suddenly made a copy of itself, which one would it be (by 'stream of consciousness')? The original? What if both are being fed absolutely identical inputs and performing the resulting absolutely identical computations?
By your philosophy, the upload would be whichever one stayed in the same memory location, operating on the same hardware throughout the copy. From this viewpoint it's easy to see that the memory location of the running processes is totally inconsequential. The upload could share and move around memory and its computations would never be disturbed. If you shuffled the memory locations of all the uploads, the invariable answer to which one is the original upload is... all of them. Until a different stimulus is introduced and the computations differ enough, they are all the same person. Nothing would be lost by terminating any number of them, as they are all essentially the same person. Now, if he copied himself 5 times and just surrounded himself with his copies, they would start to differentiate pretty quickly. Which one would he end up being? His 'original', you would say, but I would say he has a 1/6 chance of being any one of them, just like all six of them have, as time starts to differentiate them and give them different short-term memory, input, and attention-span computations.
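The "identical until the inputs diverge" claim above can be made concrete with a toy sketch. The `Upload` class and its stimuli are entirely hypothetical stand-ins; the point is only that copies with identical inputs have identical state, and one different stimulus breaks the symmetry.

```python
import copy

class Upload:
    """Toy stand-in for an uploaded mind: its state evolves with its inputs."""
    def __init__(self):
        self.memory = []

    def perceive(self, stimulus):
        self.memory.append(stimulus)

original = Upload()
original.perceive("scan complete")

# Make five copies; memory addresses differ, state does not.
copies = [copy.deepcopy(original) for _ in range(5)]

# Fed identical inputs, all six remain "the same person" by this criterion.
for u in [original] + copies:
    u.perceive("white room")
print(all(u.memory == original.memory for u in copies))  # True

# One different stimulus, and the computations begin to diverge.
copies[0].perceive("a mirror")
print(copies[0].memory == original.memory)  # False
```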
#93
Posted 30 March 2005 - 10:49 AM
The process you -> not you -> you happens every night when you go to bed, though, doesn't it?
Earlier you mentioned that I was saying that copying the hard drive to another computer does not make that computer the *same* computer. This is true. The physical hardware, the substrate that the computer runs on, is not the same. I don't think this analogy really applies, though. To me, the mind isn't the 'processor' or the hardware computer itself, but the 'desktop'. Here is a ....
#94
Posted 30 March 2005 - 04:52 PM
#95
Posted 30 March 2005 - 05:26 PM
The process you -> not you -> you happens every night when you go to bed, though, doesn't it?
The rest of the argument hinges critically on this premise and it is not a proven claim whatsoever.
#96
Posted 30 March 2005 - 05:45 PM
Before breaking your own arms patting yourselves on the back, did you not notice that from the beginning a lot of us simply don't accept this assumption?

I don't think I can prove or disprove that, Lazarus... To me, it feels obvious, as I am not around when I am sleeping. I can't vouch for others.
The rest of the argument hinges critically on this premise and it is not a proven claim whatsoever.
That is the axiom on which my whole argument rests, though; you are correct. I was not declaring victory.
One way or another, there's no way to know who is right without being copied yourself. If I am right, you'll know right away. If I'm wrong, you'd never know... and neither would we, since your copy would think it worked just as much as you would have.
#97
Posted 30 March 2005 - 06:38 PM
BTW, I do not feel like I have lost myself during unconsciousness.
I am curious why you assert that you do?
One way or another, there's no way to know who is right without being copied yourself.
For the moment only. There is nothing inherent about the conundrum that implies it cannot be solved, or that a solution depends on being copied. It is really dependent on our ability to understand the mind, not on the properties of copying it.
However:
If I'm wrong, you'd never know... and neither would we, since your copy would think it worked just as much as you would have.
In respect to being copied this may be true, but I suspect again that a lot of this implied paradox is related to our learning ability more than to basic psychology. It is also related to whether or not the end product of the copying process is an intelligence greater than you are now capable of.
There is an empirical hurdle, but there is also a conceptual hurdle to overcome; however, these cannot be presumed impossible to overcome on the basis of what we do not know.
#98
Posted 30 March 2005 - 09:05 PM
BTW, I do not feel like I have lost myself during unconsciousness.
etc...
You are right about me assuming we can't figure it out: I was just dreaming about an easy way of seeing. Overcoming the empirical hurdle might be easier than determining all of the logical properties of minds. But that's no reason to use a tired old excuse.
I don't feel like I lose myself, per se. I don't feel anything at all. Other than dreams, which are relatively few and far between, it just seems like time travel. I lie down and in the morning I'm suddenly waking up. During the period of sleep, I don't feel anything at all, or I am just not paying attention, or something. That's what I mean by losing myself: my body becomes unresponsive, and much different from me when I am awake. The only thing left of interest is a bunch of respiring flesh... the mind doesn't seem to be active, and I don't seem to remember anything that happens. Now, the mind is very active during dreaming, I know, but I'm not talking about dreaming right now.
What are the autonomic functions you consider part of the mind? Like eye blinks and heartbeats and such? I could guess that propagated signals from these automatic acts might act as parts of cyclical functions that aid or reinforce the more important aspects of a mind, but overall that seems to be more a chunk of the biology that runs the 'mind', the brain and body. Could you clarify the set of autonomous functions that are components of the mind?
#99
Posted 30 March 2005 - 10:15 PM
Am I not dependent on being aware that I am I?
Descartes says yes.
Cogito ergo sum
But this assertion also depends upon stimuli and *feedback*, even if it is an internal causal form of problem solving, like solving a mathematical problem of *choice*. However, is it not really just choice that we are talking about?
Did Stephen Hawking exist before he could communicate?
Do I exist simply as a function of the will to exist?
or
Does dreaming constitute a form of thought albeit less structured and *controlled*?
If I have no awareness of I but an awareness of everything else do I exist?
If I have only an awareness of self but no awareness of anything else do I exist?
Alright enough joking around

What are the autonomic functions you consider part of the mind? Like eye blinks and heartbeats and such? I could guess that propagated signals from these automatic acts might act as parts of cyclical functions that aid or reinforce the more important aspects of a mind, but overall that seems to be more a chunk of the biology that runs the 'mind', the brain and body. Could you clarify the set of autonomous functions that are components of the mind?
Actually, we can control many aspects of autonomic function, but I wasn't talking just about the most obvious ones. There is nothing like the lack of these signals to make us aware of how they are a measure of our awareness.
One example is heartbeat. It required adaptation for the first successful recipient of the Jarvik-7 artificial heart to get used to no longer having a beat but a hum instead, and it interfered with his ability to sleep.
It is obvious that we are capable of assimilating a whole lot of non-verbal data and sensory input beyond the conscious level; this is called *subliminal* or subconscious perception. But if we can perceive without being aware that we are perceiving, then is it fair to presume that the only valid form of perception that determines the presence of consciousness is when we are overtly and obviously aware of stimuli?
Even if the only stimuli are the *mental* tasks we assign ourselves?
#100
Posted 18 April 2005 - 07:28 PM
#101
Posted 19 April 2005 - 09:44 AM
Now,
The problem with the brain is not that it is organic, but simply that it ages due to environmental damage.
There are several theories of why we age.
Someday we will be able to keep it organic, but still ameliorated. In the same way, the degradation of inorganic things can be stopped, and we may even have the ability to fight it.
The human brain is much, much more complicated and sophisticated than the most sophisticated computer, partly due to it being organic.
You are talking about Artificial Intelligence (AI), maybe, and yes, that might solve a lot of problems.
Yours
~Infernity
#102
Posted 24 April 2005 - 11:05 AM
I think it's better to start again in a more organized way: separate the different themes in this discussion, send them to the appropriate forums, and keep only the one I think is most related to the title of this forum, "Brain-Computer Interface".
Now I'd like to talk about experiments that have possibly already been performed, real ones, for connecting the human brain to a machine.
What I will begin with are the experiments of Dr. Penfield...
#103
Posted 26 April 2005 - 07:39 AM
This scientist tried to treat some epileptic patients by removing small parts of the brain that had been located as the source of their seizures. In this process he sent electrical signals into the brain through thin electrodes to find the tissue to be removed.
But what is important to us is an unexpected, brilliant discovery made during these experiments, which I'm going to talk about later on...
#104
Posted 04 May 2005 - 10:23 AM
Edited by amordaad, 04 May 2005 - 10:48 AM.
#105
Posted 11 May 2005 - 06:05 PM
If we could copy a brain, paste it onto a new brain in a new body, and annihilate the original, quickly enough that the causal relationship between the properties of the original and the properties of the copy is equivalent to the causal relationship between the properties of the original at the moment of copying and the properties of the original at the moment of annihilation, then I think there is reason to believe that the copy is the same person as the original.
One ethical difficulty would be defining the equivalence of causal properties. Another would be rectifying any problems involving the experience of the original in being annihilated.
Perhaps an even bigger challenge for all involved would be accepting any change in the symbiotic relationship between the person and the environment. For instance, the body would be a new body, not the original body. It might be difficult for people to believe that the copy is the same person as the original (but eventually I think such things would be accepted, assuming this technology became available). Secondly, and in my opinion even more significant, would be the differences in the mechanics of interaction with the world. Part of what makes a person is his place in the world and society. Perhaps one could simply consider any changes to be just... changes.
On a more technical level, the new brain might operate in a dissimilar manner from the brain of the original. This might be a positive change.
I saw someone mention that this idea smells of a belief in a 'soul'. Actually, this idea does not conflict with materialism.
#106
Posted 11 May 2005 - 06:22 PM
However, the idea that you are a "different person" from one moment to the next undermines the purpose of the definition of identity. It's "materialism" taken to its illogical conclusion, rather than materialism (as in Dennett).
#107
Posted 11 May 2005 - 08:32 PM
Yours
~Infernity
#108
Posted 12 May 2005 - 04:00 AM
exapted, did you go through the whole thread, or just the first post and skimmed over?

I skimmed it. Sorry if I beat a dead horse.
Yours
~Infernity
#109
Posted 12 May 2005 - 10:02 AM
I skimmed it. Sorry if I beat a dead horse.

Ah, so I won't bother repeating; it's all there.
~Infernity
#110
Posted 19 May 2005 - 10:47 AM
1 - All the experiences during our life are stored in the brain, even if we can't remember them in usual daily life....
2 - The memories stored in the brain are more detailed than we usually remember, and are made of motion, colour, sounds and emotion....
3 - This stored information may be read by sending electrical impulses to the brain....
I think the main part of our being that makes up our identity and personality is this same stored information, which is the basis of even our emotions, decision making and behaviour; mostly, it makes our character.
So what is the result?
I think this information should be read and changed into some other form of encoding, and then it will be ready to be pasted onto a newly made brain....
#111
Posted 28 May 2005 - 08:25 AM
This is a link to BBC News for October 8th, 1999, about a fantastic experiment.
It is taken from the paper in the Journal of Neuroscience by Garrett Stanley and Fei Li. They attached electrodes to 177 cells in the thalamus region of a cat's brain and then traced the firing of these cells while the cat was watching scenes; using a special algorithm, they reproduced similar views on a monitor based upon the cells' activity.
This is important because, together with the experiment by Dr. Penfield I mentioned in earlier replies, it may help us read the visual memory saved in the brain and turn it into images on a computer, visible for everybody to see....
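The core idea of that experiment, decoding a stimulus from the firing of many cells, can be sketched with a simple linear decoder. This is a toy simulation, not the paper's actual method: the receptive fields, noise level, and cell count (177, borrowed from the post) are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_pixels, n_frames = 177, 64, 500

# Hypothetical linear receptive fields: each cell weights the image pixels.
fields = rng.normal(size=(n_cells, n_pixels))

# Simulated firing rates for a sequence of random stimulus frames, plus noise.
stimuli = rng.normal(size=(n_frames, n_pixels))
rates = stimuli @ fields.T + 0.1 * rng.normal(size=(n_frames, n_cells))

# Fit a linear decoder (least squares) mapping firing rates back to pixels.
decoder, *_ = np.linalg.lstsq(rates, stimuli, rcond=None)

# Reconstruct a held-out frame from its firing rates alone.
test_frame = rng.normal(size=n_pixels)
reconstruction = (test_frame @ fields.T) @ decoder
print(np.corrcoef(test_frame, reconstruction)[0, 1] > 0.9)  # True: close match
```

The point is that the image never has to be "stored" anywhere explicitly; it is recoverable from the population activity, which is what makes reading visual signals out of the brain conceivable.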
#112
Posted 28 May 2005 - 10:20 AM
~Infernity
#113
Posted 18 June 2005 - 07:56 AM
And here is the text of the news from cnn.com:
London, England -- By the middle of the 21st century it will be possible to download your brain to a supercomputer, according to a leading thinker on the future.
Ian Pearson, head of British Telecom's futurology unit, told the UK's Observer newspaper that the rapid advances in computing power would make cyber-immortality a reality within 50 years.
Pearson said the launch last week of Sony's PlayStation 3, a machine 35 times more powerful than the model it replaced, was a sign of things to come.
"The new PlayStation is one percent as powerful as the human brain," Pearson told the Observer. "It is into supercomputer status compared to 10 years ago. PlayStation 5 will probably be as powerful as the human brain."
Pearson said that brain-downloading technology would initially be the preserve of the rich, but would become more available over subsequent decades.
"If you're rich enough then by 2050 it's feasible. If you're poor you'll probably have to wait until 2075 or 2080 when it's routine," he said.
"We are very serious about it. That's how fast this technology is moving: 45 years is a hell of a long time in IT."
Pearson also predicted that it would be possible to build a fully conscious computer with superhuman levels of intelligence as early as 2020.
IBM's BlueGene computer can already perform 70.72 trillion calculations a second and Pearson said the next computing goal was to replicate consciousness.
"We're already looking at how you might structure a computer that could become conscious. Consciousness is just another sense, effectively, and that's what we're trying to design in computer."
Pearson said that computer consciousness would make feasible a whole new sphere of emotional machines, such as airplanes that are afraid of crashing.
By 2020 Pearson also predicted the creation of a "virtual world" of immersive computer-generated environments in which we will spend increasing amounts of time, socializing and doing business.
He said: "When technology gives you a life-size 3D image and the links to your nervous system allow you to shake hands, it's like being in the other person's office. It's impossible to believe that won't be the normal way of communicating."
But Pearson admitted that the consequences of advancing technologies needed to be considered carefully.
"You need a complete global debate," he said. "Whether we should be building machines as smart as people is a really big one." [thumb]
#114
Posted 18 June 2005 - 10:37 PM
Let's have a bet on it

Just kidding. Is this for real?
~Infernity
#115
Posted 02 July 2005 - 04:21 AM
A) It runs as a systematic machine: the circuits fire in an ordered pattern, whereas the brain fires neurons simultaneously. This allows us to multitask; computers can't. When we look for something or try to solve something, we might find the answer using another method; computers can only use one method of calculation.
B) There are various chemical processes and reactions taking place in the brain that a supercomputer wouldn't be able to simulate.
C) We are not systematic; our brain patterns shift and flux. A computer can't do this, because computer language is written in zeros and ones.
However, there was a theory that took a different path, first hypothesised in the late sixties and early seventies. It was called a quantum computer: 10,000 times faster and more powerful than the best supercomputer today. While a normal computer works in zeros and ones, the quantum computer works in zeros, ones and a superpositioned digit (between the one and zero, making it simultaneously a zero and a one). This allows it to multitask and support a huge capacity of space (virtual) in a small volume of space (physical).
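The "simultaneously a zero and a one" idea can be illustrated by simulating a single qubit as a state vector. This is a minimal sketch of the standard textbook formalism, not of any particular machine; the gate and register sizes are chosen only for illustration.

```python
import numpy as np

# A qubit is a 2-component complex state vector, not a plain 0 or 1.
zero = np.array([1, 0], dtype=complex)   # the |0> basis state

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ zero

# Measurement probabilities are |amplitude|^2 for each basis state.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- "zero and one at once" until measured

# n qubits require 2**n amplitudes: a huge virtual space in a small physical one.
n = 10
register = np.zeros(2 ** n)
register[0] = 1
print(register.size)  # 1024 amplitudes from just 10 qubits
```

The exponential growth of the amplitude vector is the precise sense behind the post's "huge capacity of space (virtual) in a small volume of space (physical)".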
The first practical quantum computer was created months ago at a university, as a strand of molecules in what appears to be a liquid state.
An example: the CIA uses supercomputers to make codes for encoding sensitive information. The most complex code developed by the most powerful supercomputer (before IBM's) took a year to calculate. A quantum computer can do the same in one whole day.
For brain emulation (brain downloading), only a quantum computer would be able to handle the enormous complexity of the human mind.
#116
Posted 02 July 2005 - 10:35 AM
Yours
~Infernity
#117
Posted 03 July 2005 - 12:09 AM
#118
Posted 03 July 2005 - 06:51 AM
For uploading such information, I thought there are three main ways one would probably apply.
[>] By a scanner, something that could transfer the whole of the information the brain has. Something like a very sensitive laser or whatever...
[>] Or, that you'd be connected to electrodes that will transfer all the information the brain has as electrical waves.
[>] Or that there will be something like a head covering that works on the same idea as one of the two options above...
Well, anyway, it all must be connected to something that will have the actual ability to read it and perhaps save it (and the next step, of course, is also downloading it; it's just that this is much harder at this point in the process, probably the hardest part).
Whatever this machine may be, I believe it would still be called a computer, because however super it is, it has the basic idea of a computer: reading data, encoding it, and passing the data on... It will run a program, I suppose; I believe it won't be easy to work with. Maybe the first will do no more than convert it all to waves, so it will just be written down as a graph. But we have started from nowhere; we build the future.
This computer will be designed specially for that cause, of course.
Yours
~Infernity
#119
Posted 03 July 2005 - 08:09 AM
#120
Posted 03 July 2005 - 08:34 AM
The brain would be even easier because it has lots of electrical waves, unlike, for example, a stone, which doesn't.
http://www.imminst.o...689
I am interested in the prospects of biological restructuring and gene-therapy (telomere regrowth and such). Basically anything that allows the body to survive indefinitely. I also have an interest in brain relocation. I am slightly skeptical about methods that directly involve the alteration of the mind (such as uploading and digitisation) because it raises questions about whether the new individual is really you. Would it be you looking out its eyes? Thinking its thoughts? If I can help it, I would rather achieve immortality whilst preserving the sanctity of the mind.

There are differences between copying and uploading.
When you upload, all the information about yourself up to the moment the uploading process ends is kept. After you finish uploading, your life experience keeps increasing, into information that you did not upload. Every moment means more life experience, meaning a different you. So the upload shall have VERY similar information to what you contain, but never exactly the same, as there is no 100%.
Copying: creating another you. Again, you two shall be exactly the same ONLY at the moment the process finishes; after that, each brain has its own singular life experience, which shall make you VERY similar, but not the same. Moreover, it will never be the same person; I mean, it shall never hold the same brain, and so the same thoughts and sights, etcetera; it's just a copy after all. If you die, the copy shall remain as it is, alive and well, and you won't; for "you" there is no further life experience and never shall be. Once you're dead, the copy is not your concern: just another person that has, up to some point, the same information you WOULD have at that point if you weren't dead.
Hope it's clear enough.
Yours
~Infernity