  LongeCity
              Advocacy & Research for Unlimited Lifespans





Uploading... would you do It?


122 replies to this topic

Poll: Uploading... would you do It? (225 members have cast votes)

Uploading... would you do It?

  1. Yes, I would upload. (144 votes, 66.06%)

  2. No, I don't want to upload. (30 votes, 13.76%)

  3. Maybe. (44 votes, 20.18%)


#61 armrha

  • Guest
  • 187 posts
  • 0

Posted 12 January 2005 - 08:38 PM

It's unlikely that any future substrate an upload could run on would be made of anything like transistors. The likeliest substrates in the near future would be either optical or molecular, maybe both.


Actually... the hardware to upload onto isn't as important as the software for running an upload. I mean, any Turing machine with a sufficiently large tape could run us, given the right rules for the brain... it would just be very, very slow.

If a thousand generations of millions of mathematicians worked out all the variables of your sensory input and your consciousness's output on pen and paper over millions of years, would it feel the same?

I'd have to say yes...
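The substrate-independence point in this post can be illustrated with a toy sketch. The interpreter and rule table below are invented for illustration (the program just increments a binary number, nothing brain-like): the claim is only that any computation expressible as such a rule table runs on any universal machine, however slowly.

```python
# A minimal Turing machine interpreter: any rule-based state update
# (in principle, even a brain simulation) can run on one, just slowly.
# The example program increments a binary number -- purely illustrative.

def run_turing_machine(program, tape, state="start", blank="_", max_steps=10_000):
    """program maps (state, symbol) -> (new_symbol, move, new_state)."""
    tape = dict(enumerate(tape))   # sparse dict = effectively unbounded tape
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        new_symbol, move, state = program[(state, symbol)]
        tape[head] = new_symbol
        head += {"L": -1, "R": 1}[move]
    cells = [tape.get(i, blank) for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip(blank)

# Increment a binary number: move right to the end, then carry back left.
program = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "R", "halt"),
    ("carry", "_"): ("1", "R", "halt"),
}

print(run_turing_machine(program, "1011"))  # 1011 + 1 = 1100
```

The tape is a dictionary so it can grow in either direction, which stands in for the "sufficiently large tape" the post assumes.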

#62 Karomesis

  • Guest
  • 1,010 posts
  • 0
  • Location:Massachusetts, USA

Posted 13 January 2005 - 04:11 AM

Allow me to propose some thoughts on the matter. I believe, as was previously mentioned, that the web of biological and non-biological systems in our lives will become increasingly interwoven. Uploading in the truest sense of the word is something I am quite interested in, and I will participate as soon as technology allows. If opportunity presents itself in no other fashion save guinea-pig status, then so be it: I will offer myself as fodder for scientific progress, while hopefully becoming one of the first individuals to experience multiple forms of consciousness [:o]

My ultimate goal is omnipotence, and uploading and various other forms of non-biological augmentation are a beginning to my path of exploration into the deepest realms of the cosmos. It only seems reasonable to start with the most profound inner machinations of my own consciousness.


#63 armrha

  • Guest
  • 187 posts
  • 0

Posted 14 January 2005 - 07:35 AM

Allow me to propose some thoughts on the matter. I believe, as was previously mentioned, that the web of biological and non-biological systems in our lives will become increasingly interwoven. Uploading in the truest sense of the word is something I am quite interested in, and I will participate as soon as technology allows. If opportunity presents itself in no other fashion save guinea-pig status, then so be it: I will offer myself as fodder for scientific progress, while hopefully becoming one of the first individuals to experience multiple forms of consciousness [:o]

My ultimate goal is omnipotence, and uploading and various other forms of non-biological augmentation are a beginning to my path of exploration into the deepest realms of the cosmos. It only seems reasonable to start with the most profound inner machinations of my own consciousness.


You're wonderfully honest. I like that. The power of a transhuman future is incredibly appealing, and the thought of being able to supplement your intelligence with vast knowledge and spread out across the stars is amazing. I think that to many people, though, omnipotence is a pretty threatening term. I personally want to live safely backed up and redundantly stored in an uploaded form thousands of years from now... but omnipotence? That would imply, among many other things, the capability to hurt or destroy others. That's one thing I don't support, though I'm not saying you do.

On that note, one thing I don't think is very well mapped out is how an uploaded society would function. Though it's definitely getting ahead of ourselves, I'd like to see what we could come up with. It would, of course, have to combine elements of hardware, software, and userspace management. Designing a system robust enough to support millions of people would be quite a challenge.

I'm assuming two things. One, that artificial intelligence produces the capability for sentient copies of sentient creatures to be made, and that an accurate test for sentience exists which, by checking certain points in a complex program, can determine whether it's sentient or not. (That's a big assumption, but one I hope is true.) I'm also assuming artificial intelligence will produce non-sentient but highly intelligent 'robots' that can be made to perform things (without suffering or tedium, with no more sentience than a word processor) that we'd rather not do, or just aren't fit to do.

I think at first every upload would start in their own basic environment, with almost unlimited privileges within that area and their computing allowance. Each realm or reality or whatever you want to call it would have to be completely customizable by the user. There would be one limit on what you could do within your world. More on that in a bit.

I think first off there would be a distinction between the sensorium and the consciousness of the uploaded mind. The sensorium would be basically like a very high-resolution monitor with multiple senses. People should have the capability to program their own user-interface modifications to the 'monitor', and control their inputs and outputs completely. To visit another person, you would give them a knock or a message or, if prior permissions had been worked out, just pop right into their sensorium. Your sensorium would draw their environment relative to yours, much as the real world works right now when you're in a room with someone. While the projection and the senses, and thus the consciousness, of the person move around, it would 'really' always be safe within the 'home' directory or environment of the process of your mind.
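The visiting protocol described here amounts to a default-deny permission check at the door of each sensorium. A minimal sketch, with all class, method, and level names invented for illustration:

```python
# Sketch of the visiting protocol: a resident's sensorium only admits
# another mind according to permissions the owner granted in advance.
# Everything here (Sensorium, "pop_in", "knock_only") is hypothetical.

class Sensorium:
    def __init__(self, owner):
        self.owner = owner
        self.permissions = {}          # visitor -> "pop_in" | "knock_only"
        self.present = {owner}

    def grant(self, visitor, level):
        assert level in ("pop_in", "knock_only")
        self.permissions[visitor] = level

    def request_entry(self, visitor):
        level = self.permissions.get(visitor)
        if level == "pop_in":          # prior permission: appear immediately
            self.present.add(visitor)
            return "entered"
        if level == "knock_only":      # the owner still has to answer
            return "knocked"
        return "no access"             # default-deny: they can't reach you

home = Sensorium("alice")
home.grant("bob", "pop_in")
home.grant("carol", "knock_only")
print(home.request_entry("bob"))      # entered
print(home.request_entry("carol"))    # knocked
print(home.request_entry("mallory"))  # no access
```

The key design choice is the default-deny in the last branch: it is what makes the later claim that "if they don't give you access, you can't get to them" hold.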

This is very telling, philosophically, of an uploaded world. It would probably be impossible to damage another person in the virtual world. If all else fails, even if you are shouting at them, they can just ignore you. If they don't give you access to whatever environment they are in, you can't get to them. This prevents any violent crime from being perpetrated against one another. Some days, you may not feel like doing collision detection, even. Just walk straight through the crowd. Maybe that would be a faux pas, in the future.

Objects in a digital environment could possibly be hyperlinked much the same way links are drawn nowadays; just attach a text address to the object and it can jump you right to it. You could program (or merely install) certain aspects into your sensorium. Say an object you come across has a book linked to it: your user interface could be set up to automatically clone a real, paper-bound-feeling book, with the text. Or to drop it in your knapsack, or to just put it up on a panel invisible to everyone else in the environment, a few feet in front of your head.
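The idea of user-programmable link handling can be sketched as an object carrying a text address, with the owner's interface deciding how to present it. All names and the address scheme below are invented for illustration:

```python
# Sketch of hyperlinked environment objects: the object just carries a
# text address; the user's own (replaceable) handler decides what to do.
# Handler names and the "library://" scheme are hypothetical.

def open_as_paper_book(address):
    return f"cloned paper book from {address}"

def drop_in_knapsack(address):
    return f"stored {address} in knapsack"

class UserInterface:
    def __init__(self, link_handler):
        self.link_handler = link_handler   # fully user-programmable

    def activate(self, obj):
        # Objects without a link simply do nothing when activated.
        return self.link_handler(obj["link"]) if "link" in obj else None

statue = {"shape": "statue", "link": "library://travels-of-marco-polo"}

ui = UserInterface(open_as_paper_book)
print(ui.activate(statue))  # cloned paper book from library://travels-of-marco-polo

ui.link_handler = drop_in_knapsack
print(ui.activate(statue))  # stored library://travels-of-marco-polo in knapsack
```

Two users activating the same statue would get entirely different presentations, which is the point of keeping the handler on the user's side.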

Users should be able to tinker with their brains however they want, though this would be viewed as extremely dangerous. Users should also be able to reprogram their basic sensory apparatus to operate however they want.
Knowledge bases should be ready for import, or for more traditional learning.

The most important part is that no other citizen in the whole society should have any power over any other citizen in any real, physical way. No one should be able to delete anybody else, no one should be able to force sensorium elements onto anyone else.

The only limitations I can see are to ensure that all citizens get an equal amount of processing time. This is a toughie. While many worlds probably wouldn't require much simulation beyond what needs to be broadcast to the sensorium, I guess it really depends on how much processing power is available. If the computer continually builds itself as it needs more distributed processing, then maybe it wouldn't be an issue. As it stands, everyone would have to deal with absolutely the same resources, in the absence of any life-threatening resource concerns. Of course, before the whole world goes into computers, while people are still buying their way into digital environments, they'll buy their rights and powers and spaces in specialized systems. But when all the resources members of the computing world could ever want are produced by nine redundant fusion reactors, maintained basically for free by robots and by artificial intelligence that is non-sentient but intelligent enough to run it? Without power to exert over anybody or need for anything, I think economic systems fall apart too. This means war, murder, governments, economics (at least as it stands today; I suppose there would still be a knowledge economy, or an experience economy... I think a lot of study would have to go into it to really detail this one), involuntary death, taxes, rape... all would be 'obsolete' and impossible.
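The equal-allocation rule proposed here is, at its simplest, an even split of each scheduling round. A toy sketch (real fair-share scheduling is far more involved; the function and names are invented):

```python
# Sketch of the equal-processing-time rule: every citizen gets the same
# slice of each round, regardless of how demanding their world is.
# Purely illustrative, not a real scheduler.

def allocate_cycles(citizens, total_cycles):
    """Split total_cycles equally; any remainder goes to an idle pool."""
    share = total_cycles // len(citizens)
    return {name: share for name in citizens}, total_cycles % len(citizens)

shares, idle = allocate_cycles(["ada", "ben", "chi"], 10_000)
print(shares)  # {'ada': 3333, 'ben': 3333, 'chi': 3333}
print(idle)    # 1 cycle left over for the idle pool
```

Note that a quiet pastoral world and a physics-heavy one receive the same share under this rule; whether unused cycles should be redistributed is exactly the open question the post raises.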

There are a couple of opportunities for crime, though... hopefully solutions can be found.

A citizen could, within his own environment, take the code that created him and produce another person, whom he could exert complete control over. This is madness and terribly frightening. This is why a comprehensive test for sentience would have to be developed; it could test programs being run in userspace for sentience at predefined intervals and automatically grant citizenship to any new citizens created. Possibly, if resources are very limited, it could keep a program designed in the same way as one that could develop sentience from running at all (provided reproduction in the first place is under current restriction). This kind of reeks of the halting problem, but maybe a specialized system for solving this particular case could pick out certain actions and systems behaviour necessary for it. It would probably be a huge list of criteria.

The second problem is expansion. A colony under these rules would certainly have to have a window to the outside world. It may have to migrate between galaxies to survive, or establish redundant backup systems in a different solar system, or any number of things. It would have nano-factories connected to it, but how do we decide who runs them? Everyone can't at once, and we wouldn't want everybody to be able to produce anything as a window into the real world. I imagine the colony would already have quite a few inlets into the real world, with telepresence robots and telescope views set up as environments to walk around in. Reaching outside of the internal world of the machine carries inherent risks. While the colony would be redundant and somehow protected, we wouldn't want every resident creating a nanomachine to try to vandalize the computing machines. But I am not comfortable with just, like, making a list of people who can be free to get out of the simulation; this seems horribly unfair. I don't see an adequate solution to this problem yet. Perhaps a non-sentient but intelligent Systems Administrator would have access to the upper level of everyone's minds, and be able to tell without a doubt what a person's intentions are: the ultimate Truth Machine. People who want to destroy the colony would be allowed to use the machines, but not to produce anything that could be used to destroy the colony. It would have to be one hell of an expert system, though. I would have my doubts for a long time.

Any other suggestions or solutions? I would love to put together a document detailing what work (that we can immediately understand and work out) needs to be done. After all, it is new ground: the society of the future. We need a way for people to find each other, and community areas to socialize... I begin to worry about it as the population in concept approaches the millions and millions of personalities.

#64 Infernity

  • Guest
  • 3,322 posts
  • 11
  • Location:Israel (originally from Amsterdam, Holland)

Posted 18 February 2005 - 10:02 PM

Yes! I would!
Sometimes I just feel it's too bad I cannot inject myself into someone so he will simply understand everything!
If I could upload myself, I would have to explain nothing...

Yours
~Infernity

Edited by infernity, 11 March 2005 - 02:54 PM.


#65 rachel41

  • Guest
  • 4 posts
  • 0

Posted 03 March 2005 - 02:15 PM

Why can't we upload data from our brains and remain intact, while still allowing the information in our brains to be shared? It would be ideal for people with neurological conditions like mine, who rather than using a keyboard or keyboard simulation could interact using brainwaves and a computer link.

Call me naive, but uploading does have other potentials that don't require deleting the physical original.

#66 kraemahz

  • Guest
  • 157 posts
  • 0
  • Location:University of Washington

Posted 03 March 2005 - 05:10 PM

Rachel,

That will be the most common usage of this technological field as it first becomes available, but the problem is that currently there exists no easy deep-brain scan technology: external caps can only read surface brain waves, and implanted probes can only interact with the cells directly around them.

But the purpose of uploading isn't to communicate with machines; it's to pull biology out of the equation for consciousness. People could want this for any number of reasons: machines can think much faster than humans, store more information, and do it with little error; machines can be turned back on if they run out of power; machines can store back-ups of information (which makes it a lot harder to accidentally get wiped out, especially if some physical machine you were stored on got damaged). For someone who feels their mind, as well as their life, should be boundless, uploading is a great lifting of boundaries.

You should look around the Brain-Computer Interfacing forum for info (or this very thread) if you're interested. There are several levels of commitment to being connected with technology, uploading is the highest.

#67 Infernity

  • Guest
  • 3,322 posts
  • 11
  • Location:Israel (originally from Amsterdam, Holland)

Posted 11 March 2005 - 03:00 PM

As for zoysite's signature:

"I am not at all interested in immortality, only in the taste of tea."
--Lao Tzu

Heh, a smart line, as you have to live forever to truly have anything, even the taste of tea. If you have it and then die, you never really had it, since no awareness remains that it ever was.
Sorry for going off topic; it just didn't seem worth a new thread, right?

~Infernity

#68 armrha

  • Guest
  • 187 posts
  • 0

Posted 22 March 2005 - 04:56 PM

Yes! I would!
Sometimes I just feel it's too bad I cannot inject myself into someone so he will simply understand everything!
If I could upload myself, I would have to explain nothing...


There are a lot of complications to this, too. I don't think a lot of people realize how hellish the world of the first uploads could be without the proper guidelines and safety protocols. All the suffering in the world to this point, added up, is nothing compared to what a malevolent person could do with an upload they owned the hardware for and had ultimate permissions on. They could bring you closer to the Christian idea of hell than anyone on Earth has ever suffered... or worse. Uploading is a great idea with so much potential, but a lot of people assume that all the problems are just going to be solved, like this. Instant communication won't be possible right away in any capacity. For one, it would require direct access to someone else's neural states, a horribly dangerous thing: would you walk around handing out root access to your mind? If they can control your perceptions or experiences, it's feasible to think they could find some set of them to control you absolutely. I would never let you transmit your thoughts directly to me unless there was a method of dissecting exactly what you were going to do. I'll stand by safe, slow, but dependable verbal communication; people get brainwashed by that enough already, after all.

I would wager that, in addition to running at a speed much slower than real life, the first uploads will still have basically no idea how their brain works. It'd be like having a big binary running your brain, far too large to dissect or disassemble in any meaningful way for the first uploads (if run from biological simulations). So the dangers of manipulating a brain that you don't even understand yet would be incredible; who knows what you might break? The upload equivalent of a lobotomy would be repairable, true, but still. In order to have your communication, you'd have to decode memory and be able to implant a loose memory into others, understand emotional states and be able to implement them in someone else, etc., etc.

And the worst part is, when you are arguing with someone and do this, yes, they would understand and agree with you. You would have given them a 1% chunk of infernity, and now they agree with your opinion. Why wouldn't they? They understand the reasoning, and feel very strongly that it's true. How many times do you have to do this before you two stop being two people and start being one? How would you decide who gets to share and who just has to receive? Or would you both trade at once, and then be stuck arguing again, just on opposite sides?

I would hate it if every time I started arguing with DonSpanton, he could just implant his thoughts in my brain...

#69 Infernity

  • Guest
  • 3,322 posts
  • 11
  • Location:Israel (originally from Amsterdam, Holland)

Posted 22 March 2005 - 05:12 PM

Well armrha, I can't wait for that dystopia [sfty] .
Mmm, looking forward to that; perhaps I may become a mind-hacker [tung] .
Think of it as a computer: we can decide what information people have access to.
Oh, so divine is the future game improvement! [lol] .

Yours truthfully
~Infernity

#70 armrha

  • Guest
  • 187 posts
  • 0

Posted 22 March 2005 - 05:20 PM

Well armrha, I can't wait for that dystopia [sfty] .
Mmm, looking forward to that; perhaps I may become a mind-hacker [tung] .
Think of it as a computer: we can decide what information people have access to.
Oh, so divine is the future game improvement! [lol] .


It will be nice for the option to be there; I'm certainly not arguing against the technology being necessary for the future of our race. I am just worried about structure. We need all uploads to be utterly autonomous, totally unsinkable except from within (at least to an incredible degree of probability). I don't know of any system in existence today that fits that bill.

The thought of a mind-hacker upload chills me to my very core.

#71 Shannon Vyff

  • Life Member, Director Lead Moderator
  • 3,897 posts
  • 702
  • Location:Boston, MA

Posted 21 February 2008 - 10:28 PM

I'm surprised that yes is overwhelmingly in the lead :).

#72 treonsverdery

  • Guest
  • 1,312 posts
  • 161
  • Location:where I am at

Posted 22 February 2008 - 12:22 AM

Method three
under clever circumstances, each atom of your being can be isolated physically while being piped equivalent physics to its previous neighbors; then this atom can be surrounded with other atoms which are responsive to the core atom; as the areas between the source atoms become wider, there is more and more option to create varied patterns of possible thoughts that could be; these could be screened with Hippy Vegetarian Jain AI to see if they are adequate compared with the source neuroatomic basis of thought, then given a go

isolating each atom of a being while piping it equivalent physics can be done numerous ways; quarks are a possibility, as is any future 11-space brane technology, plus there's a keen way that's just not passing the tinfoil-hat test

it's a little like making each atom a brain in a jar, then surrounding it with more jars that are absent being; running pleasing simulations on the sidecar jars; then just perhaps wobbling the source atoms to run the Being's preferred program

that way you stay you, plus have your cybercake too

right now my thought about uploading Treon is: what effect does this have on the "Il" or "It" pattern, which is similar to Jungian synchronicity; I prefer the creator's original version

I think humans should talk about the Jungian pattern as an actual thing to be changed. This video starts with an "Il" written on a street, plus has attitudes different than Treon's about elephants and televisions

Edited by treonsverdery, 22 February 2008 - 12:38 AM.


#73 Brainbox

  • Member
  • 2,860 posts
  • 743
  • Location:Netherlands
  • NO

Posted 22 February 2008 - 09:18 AM

Voted yes, I would like to meet myself in order to understand this strange individual a bit better.... :)

#74 JohnDoe1234

  • Guest
  • 1,097 posts
  • 154
  • Location:US

Posted 19 May 2008 - 08:00 AM

Voted maybe since it all depends on how it is done.

I would prefer it to be a gradual process... though I do consider myself a patternist, I would prefer integration with the technology over being copied into another substrate.

EDIT: left out a word.

Edited by Joseph, 19 May 2008 - 08:01 AM.


#75 mentatpsi

  • Guest
  • 904 posts
  • 36
  • Location:Philadelphia, USA

Posted 07 June 2008 - 09:47 AM

damn... how does uploading really make sense anyways... just because you can locate your consciousness in one specific organ doesn't mean you can transfer it merely by copying it. You can't make a 100% copy anyways; therefore you shouldn't be able to transfer consciousness through any means other than moving the brain... maybe I am missing some crucial piece of information?

Are we talking about transferring brains into cyborg bodies? That would be pretty cool ;)

Edited by mysticpsi, 07 June 2008 - 09:50 AM.


#76 vyntager

  • Guest
  • 120 posts
  • 2

Posted 15 June 2008 - 10:36 PM

damn... how does uploading really make sense anyways... just because you can locate your consciousness in one specific organ doesn't mean you can transfer it merely by copying it. You can't make a 100% copy anyways; therefore you shouldn't be able to transfer consciousness through any means other than moving the brain... maybe I am missing some crucial piece of information?

Are we talking about transferring brains into cyborg bodies? That would be pretty cool ;)


Well, it's copying of sorts. Nobody knows how to make a conscious intelligence, so obviously, for now, the only way you could upload someone would be by making a perfect copy of their brain. You might just as well keep the original, then.

But the rationale behind uploading is that not everything in your brain is:
- useful for making the sentient being that's you work and live, so you could reasonably throw some stuff away
- efficient; that is, what remains may be useful or even necessary, but we could maybe implement it better with something other than a brain.

In the worst case, you can't throw anything away, nor compress or redesign a human consciousness, without losing something important in the process. It seems safe to assume that won't be the case, though, as there's already quite a diversity of people out there whose brains lack this or that bit but are still human, for all we know. Beyond that, who knows?

Here's a thought experiment: suppose in 20 years it becomes possible to upload, but what "upload" would mean then is that the only thing preserved is your intelligence. That is, the upload would, in any situation, have the same opinion, perform the same action, and know the same things as you do. But it wouldn't feel, wouldn't even be conscious.

Think of it as some sort of Chinese room, or a giant lookup table. How you'd go about creating such a thing would, maybe, be by recording everything you do, see, hear, say, write, even think if possible (lifelogging), and then computing the simplest system that would have done everything in the same way given the same stimuli, and that would also behave, for everything else, as a normative human being.

Would you do it?
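The giant-lookup-table idea in this post can be sketched in a few lines: log stimulus/response pairs during life, then answer future situations by lookup, falling back to "normative human" behaviour for anything unrecorded. The names and example entries are invented for illustration; computing the "simplest consistent system" would of course take far more than a dictionary.

```python
# Sketch of the lifelogging / lookup-table "upload": record every
# stimulus -> response pair, then replay by lookup. Hypothetical names.

lifelog = {}                       # stimulus -> recorded response

def record(stimulus, response):
    lifelog[stimulus] = response

def lookup_upload(stimulus, fallback="behave like a normative human"):
    # Unrecorded situations fall back to generic normative behaviour.
    return lifelog.get(stimulus, fallback)

record("offered tea", "accept politely")
record("asked about immortality", "quote Lao Tzu")

print(lookup_upload("offered tea"))              # accept politely
print(lookup_upload("asked about the weather"))  # behave like a normative human
```

The sketch makes the philosophical point concrete: such a table reproduces your behaviour without containing anything that obviously feels, which is exactly the Chinese-room worry.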

#77 VictorBjoerk

  • Member, Life Member
  • 1,763 posts
  • 91
  • Location:Sweden

Posted 15 June 2008 - 10:45 PM

It is a bit odd to think about, since intelligence is so complicated and so many factors play a part. And how you act is largely a function of your mood, not your intelligence...

#78 nanostuff

  • Guest
  • 17 posts
  • 0

Posted 24 June 2008 - 07:55 PM

just because you can locate your consciousness in one specific organ doesn't mean you can transfer it merely by copying it.


Yes, it does.

You can't make a 100% copy anyways


True, but that's not necessary.

maybe i am missing some crucial piece of information?


You seem to have all the information you need, you're just misinterpreting it terribly. Nothing I can tell you will convince you otherwise, I've tried many times. I'll only say that a functionally equivalent copy is functionally the same. Since your consciousness is derived from function, it's the same consciousness. If you don't get that, I can't help you.

#79 Heliotrope

  • Guest
  • 1,145 posts
  • 0

Posted 01 July 2008 - 12:36 AM

Yeah I'd do it.

But suppose my brain is copied 99.9999999% (or whatever) and uploaded, or used to make a second organic brain in a cyborg body. How do I reconcile the fact that I now have two brains, one digital and uploaded, and one organic, in me, waiting to die biologically? Won't there be a conflict, like the freak guy with two necks and two heads? How will they ever agree on any major things, like a financial decision? True, they'd think the same way, but the uploaded brain would have accumulated the entire wealth of knowledge on the internet or whatever net it was linked to, and it'd know much more than the old me left behind; it would theoretically make better decisions.

Wouldn't there be conflicts between the TWO consciousnesses? What about quantum entanglement? Would it be like a SPLIT PERSONALITY? The digital me would be like a god to me, see-all, know-all, right? He'd be made in MY IMAGE, but then he turns against his owner and creator? Even possibly KILLS ME to resolve the conflicts (like killing the twin/clone/original copy)? We two would diverge too much to reconcile anything. What then?

Edited by HYP86, 01 July 2008 - 12:44 AM.


#80 nanostuff

  • Guest
  • 17 posts
  • 0

Posted 01 July 2008 - 07:01 PM

Just don't "diverge". If you split yourself and go off your own ways, you're not responsible for each other anymore. Transfer yourself rather than multiply.

#81 vadim

  • Guest
  • 19 posts
  • 0

Posted 23 July 2008 - 09:45 PM

I wanted to kick off this discussion on uploading... what do you think about it? Here's a primer by Mike Deering.

----
Uploading is the process of changing the material substrate of your mind from a biological, neuron-based architecture to a computer, transistor-based architecture. The biological substrate is evolutionarily designed; the computer substrate is intelligently designed.

There are several different approaches to accomplishing this feat. All of them involve scanning your biological substrate and making a functionally accurate computer-software substrate with all of the information in the original. In order to make an accurate functional copy, it may be necessary to scan and duplicate the entire biological body: there are a lot of interrelated processing functions built into the body which you would want for greatest accuracy. Although you could get the vast majority of your mind by just scanning your brain, it depends on how accurate you want the copy to be. Let's assume the best possible copy. You would need to place this virtual you in a fully interactive, fully detailed virtual environment for its proper functioning. If this virtual you were completely accurate, it would have all of the physical aches and pains of the original. If the original had a heart attack fifteen minutes after scanning, then the virtual you should have the same heart attack, even including death. The big improvement of the virtual you is that making design changes to it should be theoretically easier. After mature nanotechnology, this distinction may be moot. The two basic approaches to uploading are:

Method One - we passively scan the biological you and make a computer you. Now we have two of you. We can delete the original and call the process a success. End result, the virtual you has the subjective experience of having moved from the biological substrate to the computer substrate. But this is not acceptable to most people. Alternatively, we can establish communication between all parts of the two yous so that your subjective experience is that you are simultaneously inhabiting both substrates and let you handle deleting the original, seeing as being indefinitely tied to the original biological substrate would completely invalidate the reasons for uploading in the first place. End result, the virtual you has the subjective experience of having moved from the biological substrate to the computer substrate.

Method Two - we gradually scan and replace your biological substrate with the computer substrate. The end result is the same as method one.

Many people feel squeamish about this uploading stuff. It brings up several interesting questions, such as "What are we?" and "Why are we afraid to upload?" After a long process of elimination which I won't repeat here (unless you want me to), I think the question of what we are can be summed up as: a pattern of information. We know from our experience with computers that patterns of information can be copied, stored and edited, and, in the case of a program, run multiple times in multiple places with varying inputs. We are not used to thinking of these processes as applying to us.

Our problem with them comes from two sources: survival instinct and our unitary experience of consciousness. Our survival instinct is evolutionarily programmed. This individual pattern wants to continue to exist; that's part of the information in the pattern. If we made five copies of the pattern, each copy individually would want to continue to exist. The fact that an identical pattern continued to exist may be comforting, but it does not completely satisfy the desire for survival. This information is certainly editable, so you could theoretically change it.

The second source of our unease is our unitary experience of consciousness. If our consciousness were not unitary but multiple, perhaps we would be less apprehensive about losing one or two of them as long as others continued. But this cannot be. Consciousness is necessarily unitary. If there were two parts of me that were not aware of each other, each would experience consciousness unitarily. If they were aware of each other, the part that was aware of both would form a bridge between them, unifying their conscious experience. This is not something we can edit out. It's topology, it's mathematics, it's a fundamental characteristic of consciousness. Therefore it seems that some form of continuity of conscious experience is necessary for a successful uploading procedure.
As long as this individual pattern of information exists, regardless of the transformations it goes through, I will continue to exist.

---





Maybe we can develop a large-scale brain computer simulation (like the "Blue Brain" project) as a template and then use optical tomography to upload all the parameters of someone's brain.
The problem is that detailed tomography (at the cellular level) will alter proteins and probably destroy the original brain. So we will have only one copy, but not the biological original.

#82 Matt

  • Guest
  • 2,862 posts
  • 149
  • Location:United Kingdom
  • NO

Posted 23 July 2008 - 10:48 PM

For preserving one's own consciousness, so that it is still you that survives, I can't see there being any other way than building upon the platform we already have, which is the brain. A simple gradual upgrade, replacing various areas of the brain, would offer a more seamless transition, and would be an easier transition of 'self' than making another copy, which is not you. I can imagine that brain-enhancing technologies will not only work with our existing brain at first, but that the end result will be that they take over functions your brain used to do... Uploading your brain/mind and letting the upload live on while I die would be absolutely pointless. Who cares if a copy of me lives on? I WANT TO LIVE ON!

Edited by Matt, 23 July 2008 - 10:59 PM.


#83 nowayout

  • Guest
  • 2,946 posts
  • 439
  • Location:Earth

Posted 11 October 2009 - 10:27 PM

Uploading... would you do It?



Only to xtube...

#84 Esoparagon

  • Guest
  • 227 posts
  • 32
  • Location:Australia

Posted 12 October 2009 - 09:00 AM

Uploading isn't really escaping death at all. It'd be like suddenly seeing yourself from inside this computer program: you'd feel the same, but now you're in a computer, while the original you would still be on the outside, waiting to die. You just copy yourself into a virtual program. I suppose it's better than nothing, so I might do it... if I thought the world could use my brain to its benefit, but it's not really a good option for escaping death.

#85 Singularity

  • Guest
  • 138 posts
  • -1

Posted 09 November 2009 - 04:43 AM

It depends on what kind of upload it is. If it's for an experiment, and I could guarantee my privacy, then yes, as I would not consider the uploaded version of me to be sentient and therefore would not have to worry about its discomfort during or after the uploading experiment.

I'm an anti-theist (if that matters), but I do believe there is something unique about us that makes us sentient, and I can't see how superhuman intelligence, even a copy of my regular human intelligence, can be sentient. I think it has something to do with being atomically integrated into the universe.

On the other hand, if there was a way to physically and slowly replace all of my brain cells with more durable analogs that function 100% or better than the organic versions safely, then yes, that would be most ideal... at least for starters.

But, I would rather keep my organic body intact surrounded by a protective exoskeleton so I can continue to enjoy the sins of the flesh :|?

#86 27GV

  • Guest
  • 17 posts
  • 0

Posted 30 November 2009 - 08:30 AM

Well, will it just copy my memories, or will it actually transfer my consciousness? Or will I occupy the machine brain and the organic brain at the same time and make them both explode as they try to reconcile the various stimuli?

#87 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 30 November 2009 - 02:25 PM

I'm surprised that yes is overwhelmingly in the lead :-D.


Me too.

I voted no because of the risk that it isn't you.
I do believe the method of inhabiting neuron cells/chips and slowly integrating with them might work, but I don't like many of the other theories; some of them just sound like copy-and-paste, not the real thing. Sometimes worse: cut-and-paste.

#88 Moonbeam

  • Guest
  • 174 posts
  • 0
  • Location:Under a cat.

Posted 30 November 2009 - 07:28 PM

I was having a problem with the uploaded version not really being you, but the answer is in the first post. I've been looking for this for a long time. (Doh, it's obvious now that I've read it.)

Alternatively, we can establish communication between all parts of the two yous so that your subjective experience is that you are simultaneously inhabiting both substrates and let you handle deleting the original, seeing as being indefinitely tied to the original biological substrate would completely invalidate the reasons for uploading in the first place. End result, the virtual you has the subjective experience of having moved from the biological substrate to the computer substrate.



#89 27GV

  • Guest
  • 17 posts
  • 0

Posted 03 December 2009 - 02:47 PM

Well, the best application of "mind uploading" I could see is shifting your mind to a small computer-like thing somewhere inside your head, thus allowing it to a) reside there in your body, b) be shifted to a new biological body and hooked up, or c) be placed in a mechanical artificial body. Also, what would there be to prevent someone from "hacking" your consciousness and flooding your being with spam for penis enlargements, or even some form of mind control? Would I constantly have to install updates and put up with annoying reminders to register my spyware protection?


#90 Connor MacLeod

  • Guest
  • 619 posts
  • 46

Posted 04 December 2009 - 05:25 AM

Alternatively, we can establish communication between all parts of the two yous so that your subjective experience is that you are simultaneously inhabiting both substrates and let you handle deleting the original, seeing as being indefinitely tied to the original biological substrate would completely invalidate the reasons for uploading in the first place. End result, the virtual you has the subjective experience of having moved from the biological substrate to the computer substrate.


That might be a sufficiently bizarre experience to circumvent a person's natural survival mechanisms, but I see no reason to believe that a voluntary "deletion" would be anything other than a delusional person committing suicide.

Edited by Connor MacLeod, 04 December 2009 - 05:27 AM.




