  LongeCity
              Advocacy & Research for Unlimited Lifespans





Immortality only 20 years away


179 replies to this topic

#121 Mind

  • Life Member, Director, Moderator, Treasurer
  • 19,055 posts
  • 2,000
  • Location:Wausau, WI

Posted 27 October 2009 - 05:16 PM

I'm sorry but without clear references and sources I can't accept Mr. Pinker's statements.


I am sorry, I don't have the reference list for his presentation, but it was referenced and he mentions specific data points in the video, so it is out there. Your reaction is the same as most people's - disbelief. The anthropological data he collected indicates that during biblical times there was a 60% chance a man would die at the hand of another man. That percentage has steadily declined during the intervening centuries and is much less than 1% today.

#122 Custodiam

  • Guest
  • 62 posts
  • 3
  • Location:Hungary

Posted 28 October 2009 - 11:52 AM

I am sorry, I don't have the reference list for his presentation, but it was referenced and he mentions specific data points in the video, so it is out there. Your reaction is the same as most people's - disbelief. The anthropological data he collected indicates that during biblical times there was a 60% chance a man would die at the hand of another man. That percentage has steadily declined during the intervening centuries and is much less than 1% today.


Well, I don't really disagree with this assessment, because the technological development of society, the legal system, bureaucracy etc. have a very beneficial effect on the crime rate too.

My point was that technology in itself can cause more casualties because of new weapons.

So my point really is about negative side-effects of technology.

I don't think that a technological singularity will be a heaven on Earth.

What if there are two or three technological singularities fighting for resources?

Can you imagine a war between technological singularities?

So my problem is that evolution cannot exist without violence because it can be very tempting to use violence in order to gain resources.

So the technological singularity should encompass all of humanity or there would be competition and hence violence between the different singularities.

#123 Mind

  • Life Member, Director, Moderator, Treasurer
  • 19,055 posts
  • 2,000
  • Location:Wausau, WI

Posted 28 October 2009 - 06:29 PM

So the technological singularity should encompass all of humanity or there would be competition and hence violence between the different singularities.


I agree. New technologies need to be developed in the open.


#124 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 28 October 2009 - 07:39 PM

In response to KalaBeth about the singularity being "the new normal" and how the web has..

I kinda grew up as the web grew up.
I think I was about 8 or 9 when we started having internet. I think even then I was annoyed at it being slow and not as smooth as it is today, but I rarely thought "oh wow, it's the web!"
It's kinda like a new toy given to play with, it's not serious, it's just useful.

Maybe it was the age, maybe it's just the generation.
Most people I know and I just seem to "get along" with the devices, old and new; once we understood one, we pretty much understand all of them.

It's just learning about truly new devices that counts, and truly new ones haven't come out yet; they are always based on previous ones.

Edited by Luna, 28 October 2009 - 07:41 PM.


#125 ben951

  • Guest
  • 111 posts
  • 15
  • Location:France

Posted 28 October 2009 - 07:39 PM

I am sorry, I don't have the reference list for his presentation, but it was referenced and he mentions specific data points in the video, so it is out there. Your reaction is the same as most people's - disbelief. The anthropological data he collected indicates that during biblical times there was a 60% chance a man would die at the hand of another man. That percentage has steadily declined during the intervening centuries and is much less than 1% today.


Well, I don't really disagree with this assessment, because the technological development of society, the legal system, bureaucracy etc. have a very beneficial effect on the crime rate too.

My point was that technology in itself can cause more casualties because of new weapons.

So my point really is about negative side-effects of technology.

I don't think that a technological singularity will be a heaven on Earth.

What if there are two or three technological singularities fighting for resources?

Can you imagine a war between technological singularities?

So my problem is that evolution cannot exist without violence because it can be very tempting to use violence in order to gain resources.

So the technological singularity should encompass all of humanity or there would be competition and hence violence between the different singularities.


To me the singularity also means the end of scarcity.

Why engage in a risky fight for resources when you can create resources at will?

In a post-singularity world where nanotechnology is mastered, we can create anything we want for free. The limit might eventually be the matter in our solar system, but it seems to me that traveling to another solar system is still less risky than a fight between post-singularity groups.

We could also live in "virtual" reality to save energy and matter.

I see more danger before the singularity, when individuals or small terrorist groups have bigger and bigger destructive power but scarcity still exists.

War might happen over philosophical issues like religion, cultural differences etc., but I don't think it will be a fight for resources in a post-singularity world.

Edited by ben951, 28 October 2009 - 07:48 PM.


#126 Custodiam

  • Guest
  • 62 posts
  • 3
  • Location:Hungary

Posted 29 October 2009 - 08:08 AM

My other problem is: OK, we are now (in twenty years time) immortals.

It costs a lot of money to maintain immortality.

Obviously rich nations can provide it, but as you can see with health care, even in rich nations it probably won't be universal.

So there are the immortals and on the other side the poor mortals.

Can you imagine the envy of mortals? Can you predict the rage after some people (possibly hundreds of millions of people) are denied eternal life?

If you have the technology of immortality, to deny it to a human means to kill that human being.

So either there is a Utopian society on Earth with unlimited resources or immortality will start the mother of all civil wars.

How realistic is a Utopian society? How can a society work without competition?

Isn't the technological singularity a "communism 2" project with more technobabble?

Edited by Custodiam, 29 October 2009 - 08:12 AM.


#127 Custodiam

  • Guest
  • 62 posts
  • 3
  • Location:Hungary

Posted 29 October 2009 - 09:31 AM

Correction (sorry):

So either there is a Utopian society on Earth with unlimited resources or immortality will start the mother of all civil wars.

#128 Berserker

  • Guest
  • 60 posts
  • 0

Posted 29 October 2009 - 10:29 AM

That's why, Custodiam, I see immortality as something impossible in the short term (the next 100 years). The problems that immortality will create are too big: overpopulation, wars, ethical problems, etc. We are not ready. Also, did anyone think about having, for example, a dictator for a thousand years?

I think that even if we have the technology, we won’t be able to use it.

#129 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 29 October 2009 - 10:39 AM

That's why, Custodiam, I see immortality as something impossible in the short term (the next 100 years). The problems that immortality will create are too big: overpopulation, wars, ethical problems, etc. We are not ready. Also, did anyone think about having, for example, a dictator for a thousand years?

I think that even if we have the technology, we won’t be able to use it.


Obviously we did think of that, and if we didn't, there are people on every corner just to remind us.

There have been many conversations and many threads about this, as well as people showing their frustration that this is the response they get from the typical person.

It is possible, but it's hard, it's complicated, it has to be done right.

A dictator is the last thing I believe will happen, taking into account the way most developed countries are nowadays.

What worries me more is availability (which could cause civil wars, maybe), resources, and stupid discussions like "should we even use it? is it right for humans to live forever?", which just make my mind scream "omg, should you let a human die?". That's so stupid; don't work against it, work for it, and work on solving the rest of the problems too!

#130 VidX

  • Guest
  • 865 posts
  • 137

Posted 29 October 2009 - 11:09 PM

I guess some people tend to concentrate on obstacles, others on possibilities. The latter usually have a bigger chance to succeed. The whole "why this or that is impossible" seems futile and irrelevant to me. We need to strive to make it possible. How? Well, we already have a problem; some say that's half of the solution.

#131 KalaBeth

  • Guest
  • 100 posts
  • -3

Posted 30 October 2009 - 04:39 AM

My other problem is: OK, we are now (in twenty years time) immortals.

I suspect that's a highly optimistic timetable, and "negligible aging" is not "immortal" by a long shot. But for the sake of argument....

It costs a lot of money to maintain immortality.

At first, yes. Depending on the technology, it will almost certainly get (by first world standards) progressively less expensive as time goes on.


Obviously rich nations can provide it, but as you can see with health care, even in rich nations it probably won't be universal.

Almost certainly true. Aubrey de Grey is optimistic that, since treating age-related illness is expensive, it will be publicly paid for.
Leaving aside the policy argument for the time being, I'm not convinced that's true. I think it depends on the form the therapies take. If it's something you have to see tons of doctors and sink lots of time and resources into, no. If it could be delivered by a near-ubiquitous swarm of nano-whatsit machines at low cost, then *maybe* not. But the former's much more likely than the latter for a good long time, yes.

So there are the immortals and on the other side the poor mortals.

Certainly for the first few generations of the technology. Longer with legal restrictions, which may well pop up.

Can you imagine the envy of mortals? Can you predict the rage after some people (possibly hundreds of millions of people) are denied eternal life?

Oh, I'm sure the envy (and in some cases rage) would be intense. But see below.

If you have the technology of immortality, to deny it to a human means to kill that human being.

The same could be said of any number of life-sustaining resources. It's a morally admirable thing to donate to a food bank, but you're hardly culpable of murder if you prefer not to spend your entire paycheck on African famine relief.

So either there is a Utopian society on Earth with unlimited resources or immortality will start the mother of all civil wars.
How realistic is a Utopian society? How can a society work without competition?

Isn't the technological singularity a "communism 2" project with more technobabble?


To repeat, we're already in that position. We already live in a heaven hardly imaginable by our ancestors, and have a standard of living many people in many parts of the world truly can't comprehend.

As an example... "Gluttony" isn't on the list of "Deadly Sins" because the medieval church didn't want parishioners getting fat... it made the list because eating like a pig put you and your family in dire risk of starvation over the winter. There are prayers and writings from the time to the effect of "it would be heaven to live in a place where you could eat all you wanted, and not have to worry."

And now we have that. And most people don't give it a second thought.

And the people who STILL don't have that, almost a thousand years later?
This will sound harsh... but help them or not (and blessedly, we do - within the limits of what their culture and ours can handle) - they're hardly in a position to damage the first world.
I don't expect that to change much. Mostly because the technology required for any kind of meaningful life extension also implies a defensive technology to match. If you have machines that can get inside a cell to fix damage, you have machines that can sniff out fissionable material and eat circuit boards to deactivate homemade nukes.

None of which means it would be a wise idea to walk alone through downtown (whatever-that-era's-equivalent-of) Mogadishu showing off your bling, or that there wouldn't be sporadic outbursts of non-state violence here and there.

Again, I truly do foresee an "elves and orcs" (or at least "elves and men") cultural division for a time. How long that lasts depends on how easily "cloudable" the technology can be made, and what political strictures the "elves" of the time may put into place to restrict its development. The classic "it's okay for me and my kind, but I don't want the little people able to afford it." ... and of course assuming there's no Roman-Empire like collapse of the Western world.

Regardless, it's going to be an interesting ride.
The more of this stuff I look at, the more it feels like living in the opening years of the Renaissance. This is gonna be fun. What a fascinating time to be alive. You know, someday there are going to be historical reenactors dreaming about this period and wondering what it was like to live now.



As to the "technological singularity," personally I find it too nebulous a term - seems everyone has a slightly different idea of what exactly it entails, and at times it sounds frustratingly close to magic. Kind of like "nanotechnology." :~

#132 Custodiam

  • Guest
  • 62 posts
  • 3
  • Location:Hungary

Posted 30 October 2009 - 07:45 AM

KalaBeth wrote:

Can you imagine the envy of mortals? Can you predict the rage after some people (possibly hundreds of millions of people) are denied eternal life?

Oh, I'm sure the envy (and in some cases rage) would be intense. But see below.

If you have the technology of immortality, to deny it to a human means to kill that human being.

The same could be said of any number of life-sustaining resources. It's a morally admirable thing to donate to a food bank, but you're hardly culpable of murder if you prefer not to spend your entire paycheck on African famine relief.


I think you underestimate the meaning of immortality.

Poor people are relatively passive because they see rich people having the same "low-cost" opportunities to pursue happiness: eating, sex, etc.

But rich people die too. Death is the most democratic "institution".

Immortality means eternal joy or the possibility of eternal joy.

This is not quite like when a mortal lets another mortal die.

When everybody can die, death is part of life. You have a right to defend yourself, to finance yourself because your lifetime is limited. Everybody has the right to use their lifetime as they want. At the end, all of us will die.

But if you are an immortal, the whole situation changes.

You are cheating death. The poor can't enjoy life and they will lose everything when they die.

You are ("the rich") enjoying life and will enjoy it possibly forever.

The equation will change dramatically.

You can't dismiss it lightly.

VidX wrote:

I guess some people tend to concentrate on obstacles, others on possibilities. The latter usually have a bigger chance to succeed. The whole "why this or that is impossible" seems futile and irrelevant to me. We need to strive to make it possible. How? Well, we already have a problem; some say that's half of the solution.


You don't really understand my point.

I WANT the singularity. I WANT immortality. It is GOOD.

But as a Hungarian I lived - as a child - under communism.

So I have an ability to sense a demagogic idea.

Communism was based on very tempting lies. And it wasn't working.

Communism was the promise of a materialistic heaven, in which science and communist morality would change the world for the better.

This was the propaganda.

The reality was dictatorship, cultural short-sightedness, brainwashing, food shortages, mass executions, immense destruction, corruption, and primitive sociopaths occupying high positions.

So there is a big difference between ideology and the realisation of an ideology.

If there is as much difference between the idea and the realisation of the singularity as there was between the idea and the realisation of communism, then the singularity will be hell on Earth.

We should not attempt again an unnatural evolutionary step. It will fail.

Communism killed 100 million people.

But the idea was very tempting.

So we should be very careful.

The Devil is in the details.

Communism, fascism and futurism always walked hand in hand. These ideologies are responsible for tens of millions of deaths.

Edited by Custodiam, 30 October 2009 - 07:50 AM.


#133 VidX

  • Guest
  • 865 posts
  • 137

Posted 30 October 2009 - 04:38 PM

Well, actually, a lot of people die from simple starvation every day, while some 'rich' ones throw out unused food. I guess we are not talking about moral ideals or an overall utopia; we are talking about a technology. Those of us who won't be able to afford it will die, the same way people who can't afford expensive medicine die, knowing that it exists and is effective. IMHO society got used to the 'unfairness' of life, and most would accept the situation as it is, because, don't forget, a big part of society feels helpless, as if their lives depend on decisions 'from above' (the government, or some mysterious forces even they can't name).

Otherwise, there is enormous economic profit in having an already-grown-up consumer for an indefinite time, so there is an abundance of possible scenarios. The fact is that WE can make it possible in our lifetimes by participating, and even if we can't, it's not like there are many things of similar importance to strive for.

#134 VidX

  • Guest
  • 865 posts
  • 137

Posted 30 October 2009 - 04:44 PM

We should not attempt again an unnatural evolutionary step. It will fail.

Communism killed 100 million people.

But the idea was very tempting.

So we should be very careful.

The Devil is in the details.

Communism, fascism and futurism always walked hand in hand. These ideologies are responsible for tens of millions of deaths.





After reading this, I don't really understand what you are doing here. The goal is already SET, there's no way back, and there's nothing really to worry about, as in the worst possible scenario you'll eventually die of old age (due to aging, or those oh-so-devilish future communists). Oh boy, that's so unexpected... lol. Really, just read yourself again.

P.S. I live in a post-communist country, with all the ugly consequences still around in people's consciousness. It still doesn't seem as bad as being someone who's old and crippled, waiting for his last breath.

Edited by VidX, 30 October 2009 - 04:47 PM.


#135 Custodiam

  • Guest
  • 62 posts
  • 3
  • Location:Hungary

Posted 30 October 2009 - 05:06 PM

VidX wrote:

After reading this, I don't really understand what you are doing here. The goal is already SET, there's no way back, and there's nothing really to worry about, as in the worst possible scenario you'll eventually die of old age (due to aging, or those oh-so-devilish future communists). Oh boy, that's so unexpected... lol. Really, just read yourself again.


Well, that is not a logical reply, that is an argumentum ad hominem fallacy.

I was kinda expecting this reaction.

Of course I can understand that for you it is very simple - tomorrow you will save the world and those stupid losers against immortality will just die.

It is easy, it is quick - it will happen just like a miracle!

Actually this kind of blind faith is typical of religious and ideological fanatics.

OK, I try to express myself more exactly.

You, and a lot of transhumanists, have no idea whatsoever how immortality will affect the world.

You just like the idea and you wanna buy your own personal immortality.

It is that simple. FedEx will deliver it in no time, no?

Those stupid losers who don't wanna live forever will just die, no?

Are you serious? Do you really think the world works like that?

This is a very complex question. It has sociological, economic, psychological consequences.

Economists a few years ago weren't able to predict the world economic crisis!

How can anyone claim that he or she can predict the effects of future technologies?

This kind of confidence is the confidence of arrogance and stupidity.

I'm just trying to warn everybody about fanaticism, blind faith and stupid arrogance.

We should not create a new techno-Taliban generation.

Edited by Custodiam, 30 October 2009 - 05:18 PM.


#136 VidX

  • Guest
  • 865 posts
  • 137

Posted 30 October 2009 - 05:35 PM

Well, that is not a logical reply, that is an argumentum ad hominem fallacy.

I was kinda expecting this reaction.

Of course I can understand that for you it is very simple - tomorrow you will save the world and those stupid losers against immortality will just die.

- Nope, more like: I expect plastic surgery on a cellular level will be available, and I'll try to make sure I'll be able to afford it. If I cared about the poor, I'd already be on a mission in Africa.



It is easy, it is quick - it will happen just like a miracle!

- Not at all; I personally do what's within my abilities to help this.

Actually this kind of blind faith is typical of religious and ideological fanatics.

- It's a positive outlook that motivates me to at least TRY to do something.

OK, I try to express myself more exactly.

You, and a lot of transhumanists, have no idea whatsoever how immortality will affect the world.

You just like the idea and you wanna buy your own personal immortality.

- Neither do you. So wouldn't it be sane not to worry about millions of possible scenarios?


It is that simple. FedEx will deliver it in no time, no?

Those stupid losers who don't wanna live forever will just die, no?

Are you serious? Do you really think the world works like that?

- millions already die of simple curable causes.

This is a very complex question. It has sociological, economic, psychological consequences.

Economists a few years ago weren't able to predict the world economic crisis!

How can anyone claim that he or she can predict the effects of future technologies?

- Have you read at least one book of Ray's? It's obvious - you haven't.

This kind of confidence is the confidence of arrogance and stupidity.

- Lol... like we have many choices besides death...

I'm just trying to warn everybody about fanaticism, blind faith and stupid arrogance.

- I can just try to warn you- all this means jack s**t when you are dead. 

We should not create a new techno-Taliban generation.

- Then we should halt all technological progress to avoid any real and imaginary dangers...



I suggest you look at this as the striving for the best medicine possible. It's already great, but we need more out of it at the moment.

Edited by VidX, 30 October 2009 - 05:39 PM.


#137 Custodiam

  • Guest
  • 62 posts
  • 3
  • Location:Hungary

Posted 30 October 2009 - 06:09 PM


- Nope, more like: I expect plastic surgery on a cellular level will be available, and I'll try to make sure I'll be able to afford it. If I cared about the poor, I'd already be on a mission in Africa.


Did you know that the ancient Romans thought their empire would last forever? How arrogant and naive they were.

Will you be surprised when - and I really hope it never happens - a terrorist nuclear bomb detonates in a Western city?

Don't you think that we have to pay for our arrogance one day?

- Not at all; I personally do what's within my abilities to help this.


You have a tactic but you don't have a strategy. It can be a fatal mistake.

- It's a positive outlook that motivates me to at least TRY to do something.


I agree. But I want to do more - I want to have a strategy, not only the fanatical faith in a technological miracle.

- Neither do you. So wouldn't it be sane not to worry about millions of possible scenarios?


Some say that who is not paranoid is not sane nowadays. What a shame it would be to achieve immortality just to die in a global war.

- millions already die of simple curable causes.


Believe me, immortality is a different level. You haven't seen envy until you see the envy of mortals.

- Have you read at least one book of Ray's? It's obvious - you haven't.


I read the "Law of accelerating returns" essay and I know his basic idea. I'm not saying that it is totally false - I hope he is right!

Did you read " The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization" by Thomas Homer-Dixon?

In my opinion the rise and fall of civilisations is inevitable.

It means that any messianic "end of history" is a false promise.

The singularity is just the beginning of a new phase in evolution.

In other words, the singularity will not change the pattern of evolution, only the scale will grow.


Edited by Custodiam, 30 October 2009 - 06:29 PM.


#138 VidX

  • Guest
  • 865 posts
  • 137

Posted 30 October 2009 - 06:40 PM

Will you be surprised when - and I really hope it never happens - a terrorist nuclear bomb detonates in a Western city?
Don't you think that we have to pay for our arrogance one day?

- It's a completely different issue (possible aggression from someone), consisting of many elements which are far away from the topic.

You have a tactic but you don't have a strategy. It can be a fatal mistake.

- I have a strategy (depends what you consider a strategy), sir; I'm still developing it, and I'm sure others are too.

I agree. But I want to do more - I want to have a strategy, not only the fanatical faith in a technological miracle.

- I call that possible progress, not miracles. Everything else is a long process of planning and adjusting, depending on the situation.

Some say that who is not paranoid is not sane nowadays. What a shame it would be to achieve immortality just to die in a global war.

- or not, who knows..

Believe me, immortality is a different level. You haven't seen envy until you see the envy of mortals.

- You may be right on this one. I guess we'll have another '3rd world', as there's no way the technology will be available for EVERYONE.

Did you read " The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization" by Thomas Homer-Dixon?

- Nope, I should, I guess. Though IMHO these 'falls' became gradually 'softer' as the centuries went by. It's natural and expected: from a complete 'fool', the human became a little wiser, and that matters.

Edited by VidX, 30 October 2009 - 06:44 PM.


#139 Custodiam

  • Guest
  • 62 posts
  • 3
  • Location:Hungary

Posted 31 October 2009 - 08:24 AM

- It's a completely different issue (possible aggression from someone), consisting of many elements which are far away from the topic.


No, it is not a different issue. The blind selfishness and greed of immortalists can trigger revenge, possibly a global payback.

It can be the straw that breaks the camel's back.

- I have a strategy (depends what you consider a strategy), sir; I'm still developing it, and I'm sure others are too.


I'm very sorry to say but you have to wake up first. You are living out of sync with global and historic reality.

- I call that possible progress, not miracles. Everything else is a long process of planning and adjusting, depending on the situation.


Frank J. Tipler's idea of an Omega Point, which can save the consciousness of every conscious being that ever existed, is also "possible progress".

He proves with scientific means that a techno-God is a scientific possibility - this God is the ultimate transhumanist civilisation.

There are nice ideas out there. I'm a little bit sceptical about them.

I can advise the same for everyone.

- or not, who knows..


Do you like Russian roulette? I don't.

- You may be right on this one. I guess we'll have another '3rd world', as there's no way the technology will be available for EVERYONE.


There is only one true solution: a global trust for those who wanna achieve immortality, a kind of global immortalist health plan and fund. I don't think there would be too many participants, even if it is open. But if there are many participants, it can generate an immense amount of funds.

- Nope, I should, I guess. Though IMHO these 'falls' became gradually 'softer' as the centuries went by. It's natural and expected: from a complete 'fool', the human became a little wiser, and that matters.


The pattern of evolution absolutely makes it unlikely that the singularity will be the end of history.

It is just a new evolutionary step.

Aggression and violence will not end, because as long as there is competition, violence will remain.

***
On the other hand, the single most problematic aspect of Ray Kurzweil's idea is the strong AI.

As I wrote earlier, because of Gödel's theorems an algorithmic AI can never surpass a conscious intelligence. It is proven.

We currently have no idea how a non-algorithmic AI could work, apart from our brains.

Edited by Custodiam, 31 October 2009 - 08:58 AM.


#140 VidX

  • Guest
  • 865 posts
  • 137

Posted 31 October 2009 - 09:31 AM

 

No, it is not a different issue. The blind selfishness and greed of immortalists can trigger revenge, possibly a global payback.

It can be the straw that breaks the camel's back.

- Well, it's not like somebody will have access by default; you'll be able to work and earn it, or sell everything you have, so I don't really see the enormous greed. Not to mention illegal LE services for a cheaper price. I'm sure it will be 'just' another kind of business, a very, very huge one.

I'm very sorry to say but you have to wake up first. You are living out of sync with global and historic reality.

- And why is that so? I'm uncertain and you are clearly pessimistic.

Frank J. Tipler's idea of an Omega Point, which can save the consciousness of every conscious being that ever existed, is also "possible progress".

He proves with scientific means that a techno-God is a scientific possibility - this God is the ultimate transhumanist civilisation.

There are nice ideas out there. I'm a little bit sceptical about them.

I can advise the same for everyone.

- I'm not into that kind of overwhelming utopia ideas; I'm looking at it as a natural and VERY expected outcome of the course of evolution. Actually, every time human life became easier it was due to technology, starting from the invention of fire. It all has its negative sides, but to completely underestimate it - that's not wise at all.

Do you like Russian roulette? I don't.

- I don't really think we have a choice. And it's really ironic - the worst-case scenario is death, lol... I'd rather take a chance at that roulette, as there's not much time to ponder a perfect solution. It's how you look at things, I guess.

There is only one true solution: a global trust for those who wanna achieve immortality, a kind of global immortalist health plan and fund. I don't think there would be too many participants, even if it is open. But if there are many participants, it can generate an immense amount of funds.

- I'm not sure if it's the ONLY solution, but it seems like a decent basis for an idea. You can be sure there will be an enormous amount of global debate when/if we achieve this; we are already one global techno society after all, more than we think, probably.

The pattern of evolution absolutely makes it unlikely that the singularity will be the end of history.

- The end of history would be the collapse of the universe, lol... The singularity may be a new kind of civilisation or something like that.

Aggression and violence will not end, because as long as there is competition, violence will remain.

- Well, maybe AI will help us solve this; if not, we'll have to learn how to live with it/control it... There are just so many possible outcomes, it'd be silly to present everything as just black or white.





As I wrote earlier, because of Gödel's theorems an algorithmic AI can never surpass a conscious intelligence. It is proven.

- Have you heard of "Numenta"? It seems the brain isn't that complex after all, and there's so much more to find out; I'm not even talking about bio-computing, which is taking off currently. And yeah - we can see obstacles, or we can just keep searching for a way around them.

#141 Berserker

  • Guest
  • 60 posts
  • 0

Posted 31 October 2009 - 12:28 PM

Just one thing to consider: in many African countries the life expectancy is around 40 years... in developed countries it is almost 80.
So it's double, and there is not a world war. As VidX says, people accept the situation. What if in the next few years we can increase life expectancy to 120?

Because I think that huge life extensions will come eventually. First we may get 5-10 year life extensions, then 20 years, then maybe 50, etc... I mean, it is going to be a big process with many steps, not just one event, and I think that people will just accept the situation, the same as African countries do today. If you think about it, we are already living in a society with an extended life expectancy.

#142 VidX

  • Guest
  • 865 posts
  • 137

Posted 31 October 2009 - 12:58 PM

Agreed, and as this will most probably happen in small steps, there will be enough time for the global perception to adapt. Hell, it's already happening: advanced plastic surgery, hormone replacement, organ/tissue growth in a lab... fascinating things, really, and we already accept them with a "Meh..." attitude.

#143 atp

  • Guest
  • 138 posts
  • 16

Posted 27 December 2009 - 01:09 AM

Well, my problem with the Moore's law argument is that my old Commodore 64 computer 25 years ago had the same type of AI as my multi-core PC has now.

To be honest, PCs and software are algorithmic logical tools and not AIs, in my opinion.

I don't find my PC more intelligent now than my C-64 was 25 years ago.

It is much quicker and can store much more data, but is it really intelligent?


It is still not intelligent in a human-like way.

But intelligence is defined much more broadly.

If you have doubts about Moore's law, watch this (30 years of progress in video games):





And now imagine what we can expect in biotech 30 years from now.
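A minimal Python sketch of what that kind of sustained doubling implies over 30 years; the two-year doubling period is an assumption for illustration, not a figure taken from the posts above:

    # Assumption: capability doubles every 2 years (illustrative only).
    doubling_period_years = 2
    horizon_years = 30

    doublings = horizon_years / doubling_period_years   # 15 doublings
    growth_factor = 2 ** doublings                       # 2**15 = 32768

    print(f"{doublings:.0f} doublings over {horizon_years} years "
          f"gives roughly a {growth_factor:,.0f}x improvement")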

#144 karl_bednarik

  • Topic Starter
  • Guest
  • 436 posts
  • 105
  • Location:Wien, Oesterreich (Vienna, Austria)

Posted 27 December 2009 - 03:32 AM

That is completely correct.

In 1964 I worked with a slide rule, logarithm tables, graph paper, a French curve, a pencil and an eraser.

Today there are EXCEL and 3-gigahertz processors, and the only question is: "What colour should the curve be?"

We only need to be able to simulate the biological functions of human body and brain cells.

3 billion base pairs are 6 gigabytes, plus the interactions of 30,000 proteins, so certainly far fewer than 900,000,000 = 0.9 billion interactions.

Simulating the information processing in the human brain is not directly necessary for a longer life.

100 billion brain cells with roughly 1,000 synapses each: that is 10^14 synaptic transmission factors, each with considerably more than 2 possible values.

#145 karl_bednarik

  • Topic Starter
  • Guest
  • 436 posts
  • 105
  • Location:Wien, Oesterreich (Vienna, Austria)

Posted 27 December 2009 - 05:52 AM

Oops:

3 billion base pairs are 6 gigabits, or 0.75 gigabytes.
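A minimal Python sketch of the back-of-envelope arithmetic in the two posts above, using the corrected figure of 6 gigabits (2 bits per base pair); the input numbers come from the posts, the code framing is mine:

    # Genome: 3 billion base pairs at 2 bits each (4 possible bases).
    base_pairs = 3e9
    genome_gigabits = base_pairs * 2 / 1e9        # 6 gigabits
    genome_gigabytes = genome_gigabits / 8        # 0.75 gigabytes

    # Proteins: pairwise interactions among ~30,000 proteins (an upper bound).
    proteins = 30_000
    max_interactions = proteins ** 2              # 9e8 = 0.9 billion

    # Brain: 100 billion neurons with ~1,000 synapses each.
    neurons = 1e11
    synapses_per_neuron = 1e3
    synaptic_parameters = neurons * synapses_per_neuron   # 1e14

    print(f"Genome: {genome_gigabits:.0f} Gbit = {genome_gigabytes:.2f} GB")
    print(f"Protein interactions (upper bound): {max_interactions:,.0f}")
    print(f"Synaptic parameters: {synaptic_parameters:.0e}")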

#146 Solarclimax

  • Guest
  • 209 posts
  • -62

Posted 06 March 2010 - 01:59 PM

Could our software possibly still be so lousy by then that we won't be able to simulate a human brain?

Yes.


No, because computers capable of such power would be able to help us get the software right.

Edited by Solarclimax, 06 March 2010 - 02:00 PM.


#147 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 06 March 2010 - 07:39 PM

Could our software possibly still be so lousy by then that we won't be able to simulate a human brain?

Yes.


No, because computers capable of such power would be able to help us get the software right.


It usually works the other way around..

#148 Solarclimax

  • Guest
  • 209 posts
  • -62

Posted 06 March 2010 - 07:45 PM

Could our software possibly still be so lousy by then that we won't be able to simulate a human brain?

Yes.


No, because computers capable of such power would be able to help us get the software right.


It usually works the other way around..


That's because we don't have such powerful computers to help us out. Kinda like we drive the car because we don't have a car powerful enough to drive itself, yet. Programs already exist that can show us how to do things better.

Edited by Solarclimax, 06 March 2010 - 07:49 PM.


#149 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 07 March 2010 - 10:34 AM

Could our software possibly still be so lousy by then that we won't be able to simulate a human brain?

Yes.


No, because computers capable of such power would be able to help us get the software right.


It usually works the other way around..


That's because we don't have such powerful computers to help us out. Kinda like we drive the car because we don't have a car powerful enough to drive itself, yet. Programs already exist that can show us how to do things better.


The thing that's supposed to drive the car is software, not hardware; if you had much better hardware and similar software, it's just more likely to hit a wall faster.

#150 Solarclimax

  • Guest
  • 209 posts
  • -62

Posted 07 March 2010 - 03:06 PM

Could our software possibly still be so lousy by then that we won't be able to simulate a human brain?

Yes.


No, because computers capable of such power would be able to help us get the software right.


It usually works the other way around..


That's because we don't have such powerful computers to help us out. Kinda like we drive the car because we don't have a car powerful enough to drive itself, yet. Programs already exist that can show us how to do things better.


The thing that's supposed to drive the car is software, not hardware; if you had much better hardware and similar software, it's just more likely to hit a wall faster.


Don't argue with me, I know origami.



