Full-fledged nanotech will be here shortly



#1 MichaelAnissimov

  • Guest
  • 905 posts
  • 1
  • Location:San Francisco, CA

Posted 04 June 2004 - 01:12 AM


Full-blown molecular manufacturing is right around the corner. Sometime between 2005 and 2020, new manufacturing technologies will be created with practical effects exceeding those of ten Industrial Revolutions. Products will be manufactured by molecular machinery, built from the atoms up, with each piece put into its precise place.

Some minor points about nanotechnology:

- Going from a basic assembler to a nanofactory of practically arbitrary size is easy.
- Nanotech products will be made of diamond. A single desktop nanofactory can create enough diamond to crash the gem market within hours.
- Every present-day industry will be displaced by nanotechnological equivalents. This will happen over years or months, not decades.
- Non-nanotech companies will not be able to compete economically with nanotech companies.
- Non-nanotech powers will not be able to compete militarily with nanotech powers.
- Nanotech products will cost as much as their raw materials.
- Design of new products will be CAD-aided and easy for anyone.
- Massive life extension will quickly become feasible.
- Many huge developments could literally happen overnight.

If you can spare an hour or two of time, I strongly recommend looking into the following:

Thirty Essential Nanotechnology Studies

Nanotechnology is one of those subjects that 99% of transhumanists think they understand, but don't. Doing the reading is the only way to overcome that lack of understanding.

Here are two short papers I've written on the subject:

More Dangers from Nanotechnology
My Position on Nanotechnology Administrative Policy

#2 lightowl

  • Guest, F@H
  • 767 posts
  • 5
  • Location:Copenhagen, Denmark

Posted 04 June 2004 - 01:42 AM

Yes, I believe this is true. Nanotechnology is going to change the world, and it is not as far-fetched as many people think. Massive advancements are happening right now. Investors are lining up. Products are getting ready for the market. Actual products are already being sold to the general public.

But one problem still persists. Ask 100 people and only 10 will have even a rough idea of what nanotechnology is, and each of them will give a different answer. What is alarming is that almost none of them realize the changes this technology is going to bring.

People need to wake up.

Even more alarming is the ignorance that still resides in legislative administrations. Most of them know it is coming, but they don't fully realize the potential of this disruptive technology. We are living in a pre-nano age. The step into the nano-age is small, and we are getting ready to take it. Some people are already living in this new age, just waiting for it to arrive.

Many discussions have been held, but only in closed or enlightened groups. I doubt the general public will have time to consider the possibilities before they are immersed in rapid innovation. Unable to keep up with this revolution, many will be condemned to a kind of sheep-hood: a consumer generation without a clue about how things work.

The time has come for everyone to pick the people they hold dearest and open their eyes, for the steep part of the curve is getting near. Evolution is taking the next step, and you had better sit tight or you will fall off when the train starts accelerating.

#3 Jay the Avenger

  • Guest
  • 286 posts
  • 3
  • Location:Holland

Posted 04 June 2004 - 11:30 AM

A nice link to go along with this kickass topic: http://www.kurzweila......html?id=3395

Study: Self-replicating nanomachines feasible

Small Times, June 2, 2004


A useful self-replicating machine could be less complex than a Pentium IV chip, according to a new study of "kinematic cellular automata" performed by General Dynamics for NASA.

Through simulations, the researchers demonstrated the feasibility of this kind of self-replication, which could in a decade or more lead to the mass manufacture of molecularly precise robots, display monitors and integrated circuits that can be programmed in the field, the study said.
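Why demonstrated self-replication translates into "mass manufacture" comes down to doubling arithmetic. Below is a minimal back-of-the-envelope sketch in Python; the assembler mass and replication time are illustrative assumptions, not figures from the General Dynamics/NASA study:

```python
# Illustrative doubling model of a self-replicating assembler population.
# All parameters below are assumptions made for this sketch; they are not
# taken from the kinematic-cellular-automata study discussed above.

ASSEMBLER_MASS_KG = 1e-15      # assumed mass of a single assembler
REPLICATION_TIME_HOURS = 1.0   # assumed time for one assembler to copy itself
TARGET_MASS_KG = 1.0           # stop once the population weighs one kilogram


def hours_to_reach(target_kg: float,
                   unit_kg: float = ASSEMBLER_MASS_KG,
                   t_double: float = REPLICATION_TIME_HOURS):
    """Hours (and final count) for one replicator to double up to target mass."""
    count, hours = 1, 0.0
    while count * unit_kg < target_kg:
        count *= 2            # every existing replicator copies itself once
        hours += t_double
    return hours, count


if __name__ == "__main__":
    hours, count = hours_to_reach(TARGET_MASS_KG)
    print(f"~{hours:.0f} hours and {count:.3e} assemblers to reach 1 kg")
```

With these assumed numbers, a single assembler grows into roughly 10^15 copies (about a kilogram of machinery) after ~50 doublings, i.e. around two days. The exponential scaling is why the feasibility of self-replication matters far more than the speed of any individual device.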




#4 bacopa

  • Validating/Suspended
  • 2,223 posts
  • 159
  • Location:Boston

Posted 05 June 2004 - 08:50 AM

Thanks for the links and your insight, Michael.
Quite scary stuff. A couple of things in CRN's article Dangers of Molecular Manufacturing interested me. Excuse me for over-quoting, but I'd like to break this one down...

MNT will reduce economic influence and interdependence, encourage targeting of people as opposed to factories and weapons, and reduce the ability of a nation to monitor its potential enemies.

It is scary to think that humans will be the most sought-after target if MNT reduces the importance of material goods. If self-replicating nanomachines can produce whatever they want, whenever they want, I can see how warring nations would shift their focus from material goods to people. Perhaps on a positive note this will discourage criminal and warlike behavior if nations no longer fight over material goods...

As many as 50 billion toxin-carrying devices—theoretically enough to kill every human on earth—could be packed into a single suitcase. Guns of all sizes would be far more powerful, and their bullets could be self-guided. Aerospace hardware would be far lighter and higher performance; built with minimal or no metal, it would be much harder to spot on radar. Embedded computers would allow remote activation of any weapon, and more compact power handling would allow greatly improved robotics. These ideas barely scratch the surface of what's possible.

I can't even begin to imagine what horrors this technology will allow rogue terrorist groups to unleash. But the potential for positive use could far outweigh the negative; certainly ethics will remain a top issue when nanotech becomes part of the arms race, as this article discusses...

All-out nanotech war is probably equivalent in the short term, but nuclear weapons also have a high long-term cost of use (fallout, contamination) that would be much lower with nanotech weapons. Nuclear weapons cause indiscriminate destruction; nanotech weapons could be targeted. Nuclear weapons require massive research effort and industrial development, which can be tracked far more easily than nanotech weapons development; nanotech weapons can be developed much more rapidly due to faster, cheaper prototyping. Finally, nuclear weapons cannot easily be delivered in advance of being used; the opposite is true of nanotech.

I also fear the consequences if we start using this kind of weaponry against our own people, if terrorist groups within the U.S. wish to eliminate what they see as potential rival threats. I shudder to think that someone with a revenge scheme could use nanotech to kill someone without anyone finding out.

It may also, by enabling many nations to be globally destructive, eliminate the ability of powerful nations to "police" the international arena. By making small groups self-sufficient, it can encourage the breakup of existing nations.

This is in keeping with my fears... Overall, the dangers of nanotech seem potentially devastating; no wonder CRN takes the time to map out the potential problems of nanotech manufacturing. I'd imagine many philanthropic and environmental groups will strongly object to nanotechnology when they read about the potential dangers...

#5 bacopa

  • Validating/Suspended
  • 2,223 posts
  • 159
  • Location:Boston

Posted 05 June 2004 - 11:31 AM

From Michael's article...

What about when it becomes possible to use gene therapy or hormones to compress the interval between childhood and adulthood? (Advances in robotics and new building methods could allow the creation of automated superstructures that produce millions of new humans per year.) Other science fictional-sounding scenarios are plausible. CRN tends to hint towards these scenarios only vaguely, no doubt to avoid sounding incredulous, but it is very important to be aware of them.

This is something I hadn't thought of, but reading it now it sounds completely plausible that some unethical types might use these nano-factories to mass-produce people and tamper with the ageing process. I wonder if people will use nano-factories to mass-produce people for military purposes. Of course bio-engineering is already a hot topic, but its inherent problems seem to be exacerbated when nanotech is factored in.

The creation of transhuman intelligence could entail the rapid disintegration of a carefully constructed network of constraints and safeguards on nanomanufacturing capabilities.

Interesting connection you make here between transhuman, smarter-than-human intelligence and the ability to hack nanomanufacturing safeguards. Certainly the structure of society as a whole will inevitably have to change when transhuman intelligence comes into play; I'd imagine any non-augmented human would feel hopelessly stupid trying to do anything worthwhile surrounded by intelligence-modified humans. Therefore, as Kurzweil notes in The Age of Spiritual Machines, non-augmented humans will have to augment themselves to keep up, and the people in charge of nanomanufacturing had better augment themselves to devise even stronger safety measures.

Enhanced humans will quickly create unprecedented effects in economic, social, scientific, and military spheres.

Telepresence, coupled with powerful robotics and sophisticated interfaces that implement commands based on simple gestures, will permit the development of "nano-wizardry" - individual soldiers with sufficient capability to destroy, subvert, or torture entire armies or cities. Independent human flight will become possible with a minimum of aerospace hardware. Reprogrammable phased-array optics allow complete invisibility. Perfect surveillance, neurological enhancements, responsive environments, smart materials, and so on.

I just hope that nanotech is not exploited by the military, but I somehow know it will be. What angers me about these new technologies is that they obviously get used, imo, for non-productive, even destructive, purposes. This is why greater-than-human intelligence may be the only solution able to harbor enough wisdom to show true compassion rather than human-level cruelty and irresponsibility; you may agree with this sentiment.

Many religions and other belief systems will be ruined.

Nanotechnology will make it feasible to reproduce many classes of Biblical miracles. Humans enhanced with nanoengineered body parts and telerobotic control interfaces may have angel-like or even god-like capabilities. Nanotechnologically facilitated approaches to life extension will rapidly allow the abolition of death. Work will no longer be necessary. Etc.

Yay! But somehow God will creep in there anyway; people will probably find even more reason to believe in the divine, for how else could such an amazing technology just 'come to be'?

To avoid the negative impact of grey goo, green goo, nano-litter, human rights disasters, nano-wizardry, economic and social upheaval, arms races, and other unforseen risks will require true superintelligence, nothing less. Once created, superintelligence will compound upon itself rapidly, resulting in the creation of agents with deity-class capabilities

You don't have much faith in human nature; well, neither do I. The more I think about it, the more I realize how important superintelligence in the form of benevolent AI will be, and I wish more people would see this. Not enough transhumanists seem to value the importance of 'kinder than human' superintelligence; they seem to forget the human irrationality that stems from 'human level' thinking. Human beings, in my opinion, are grossly incapable of making ethical decisions much of the time, and with regard to nanotech this could be even worse; that is one of the reasons I value transhumanist thought so much. But until I was exposed to the Singularity, I never would have thought that people would think to create benevolent AI, a truly amazing idea... I find it interesting how you've taken a pessimistic stance on human nature and put a positive spin on it. Most people can't look human imperfection square in the eye, let alone figure out a potential solution.

#6 MichaelAnissimov

  • Topic Starter
  • Guest
  • 905 posts
  • 1
  • Location:San Francisco, CA

Posted 05 June 2004 - 11:34 PM

Devon, thanks for your thoughtful comments; I concur with everything you've said. Accelerating the creation of favorable technologies, most notably benevolent superintelligence, is of utmost urgency. I am not "pessimistic" about human nature per se - just realistic. If our world is saved by the right people making the right choices, then that will be a testament to the positive side of human nature. I'm not sure why 99% of transhumanists largely ignore the prospect of benevolent superintelligence; I see it as the only escape hatch from the human-insurmountable challenges just around the corner. My guesses are as follows:

1) Most transhumanists feel that faster-than-human intelligence will be limited to influencing the world at human-equivalent speeds.

2) Most transhumanists feel that smarter-than-human intelligence will be limited by slow human infrastructures. (Was Homo sapiens limited by Neanderthal infrastructures?)

3) Most transhumanists feel that duplicating general intelligence won't be feasible for many decades, despite what we currently know about cognitive science and the extremely large quantities of computing power nanotech will make possible.

4) etc?

This large gap of opinion on the issue of kinder-than-human superintelligence between myself and many other transhumanists is very disturbing to me. How do other transhumanists expect to survive nanotech, much less the arrival of human-indifferent superintelligence?

#7 lightowl

  • Guest, F@H
  • 767 posts
  • 5
  • Location:Copenhagen, Denmark

Posted 06 June 2004 - 12:27 AM

How do other transhumanists expect to survive nanotech, much less the arrival of human-indifferent superintelligence?

I don't expect to survive the coming of super-intelligence in my current form. I hope there will be enough time to augment both body and brain before the intellectual evolution takes over completely. By that time, again, I am hoping constant upgrading of my mental and physical systems will help me cope with the acceleration of complexity the future will almost certainly bring.

I agree with many that the time is coming for posthumans and "artificial" beings to spread out into the universe. I predict a vast variety of different beings will start populating our entire solar system. This will be a necessary step for complexity to expand. I think a complete collapse of the current social system is unavoidable. The only solution is a major upgrade and adaptation to future conditions.

In my opinion, wars and terror will not end with super-intelligent beings taking over. The galaxy will, with time, probably be filled with many different societies living in their own ways. Conflict will almost certainly persist. Unless one race or society becomes all powerful and has the ability to control the entire galaxy, which is a great task indeed.

I suppose I am pessimistic in my views on future life structures, but I also believe there will be room for "perfect" societies: places where beings who want to live in peace and prosperity will seek to be. Those places would probably have the fastest progress and very good chances of protecting themselves from rogue beings and space pirates.

#8 bacopa

  • Validating/Suspended
  • 2,223 posts
  • 159
  • Location:Boston

Posted 06 June 2004 - 01:16 AM

1) Most transhumanists feel that faster-than-human intelligence will be limited to influencing the world at human-equivalent speeds.

Obviously, given the vast speeds at which they will be able to operate, the time frames of AI and humans will be completely mismatched, as you've pointed out in several posts. Certainly many transhumanists, myself included, can't always get past anthropocentric thinking, which is too bad, because we are only one form of intelligence and, as you said, equivalent to .0000000001% of the total intelligence possible. Once people can get over their anthro egos, the acceptance of SI will finally come to fruition, I guess...

This large gap of opinion on the issue of kinder-than-human superintelligence between myself and many other transhumanists is very disturbing to me. How do other transhumanists expect to survive nanotech, much less the arrival of human-indifferent superintelligence?

Yes, and after reading CRN's warnings of potential nano threats I can see the reason for you being disturbed... People have to start looking at the specific truths of the situation and step down from fantasy sci-fi land; this bugs me.

#9 Mind

  • Life Member, Director, Moderator, Treasurer
  • 19,036 posts
  • 2,005
  • Location:Wausau, WI

Posted 06 June 2004 - 02:18 AM

Before I say the following, I want to let everyone know that I believe in free will...and the only thing I can be sure of in the world is what my senses currently feed my mind. Therefore, I am working to make the world the best place it can be.

This is a good time to think about John Doe's article - The Transhuman Condition. If a benevolent super intelligence is developed, we may never know. Even the creators of this SI may never know if they were successful. Many days down the road, when we are all sitting around on a beach in the sunshine pondering how the world became so peaceful and perfect all of a sudden, someone might say "maybe we are being guided/ruled/manipulated by an SI". Maybe we already are.

#10 lightowl

  • Guest, F@H
  • 767 posts
  • 5
  • Location:Copenhagen, Denmark

Posted 06 June 2004 - 02:32 AM

Yes Mind, it's a good point. Some people think that SI is God. I certainly don't. I think if I were to find myself in a perfect paradise all of a sudden, I would wonder what had happened. Also, I would quite possibly feel a bit powerless, not knowing what controlled me and what I was allowed to do. You are right: it is a matter of free will. But how would we know whether we still had free will if the laws of nature were altered? We would still have free will within the confines of an altered universe. If something was impossible, we could only blame the laws of nature.

#11 bacopa

  • Validating/Suspended
  • 2,223 posts
  • 159
  • Location:Boston

Posted 06 June 2004 - 03:17 AM

And I want to let everyone know that I believe in both free will and determinism: I'm a compatibilist, and I believe the two can go hand in hand and work together. I'm okay with that. I think some things we have control over and other things we don't... but as Paul Hughes mentioned in his metaprogramming and super free will entry, through the cracks of hard determinism we can metaprogram and thus attain some degree of free will. Once we realize that we are just programs being run by higher-level metaprograms, we can reprogram our thought patterns, adjust accordingly, and override the programmed responses, thus attaining what he defined as 'super free will', also described by John Lilly, psychedelic pioneer of human cognition. I think classical physics can and probably does work hand in hand with quantum indeterminism and fuzzy logic, allowing for freedom through the cracks of hard deterministic motivators. And with the coming of SI I firmly believe that free will can and will be fully realized once we tweak a few hard-wired deterministic thought patterns, which will be a blip in the history of our once predetermined, programmed thinking. So I too am optimistic. :)

#12 Jay the Avenger

  • Guest
  • 286 posts
  • 3
  • Location:Holland

Posted 04 December 2004 - 06:53 PM

- Nanotech products will cost as much as their raw materials.


Don't you think the people who designed these products will want to receive some money for their efforts?

Basically, you'd be paying for the information behind the product itself. Many products (especially technological equipment) today already carry a high price simply because people were needed to design them in the first place. The materials a CPU consists of are worth a few cents, tops.

#13 apocalypse

  • Guest
  • 134 posts
  • 0
  • Location:Diamond sphere

Posted 03 April 2005 - 02:13 AM

This large gap of opinion on the issue of kinder-than-human superintelligence between myself and many other transhumanists is very disturbing to me.  How do other transhumanists expect to survive nanotech, much less the arrival of human-indifferent superintelligence?-MichaelAnissimov

By attempting to be directly involved in the creation of the first one (thus ensuring that's not the case)... [thumb]

Unless one race or society becomes all powerful and has the ability to control the entire galaxy, which is a great task indeed.-lightowl

A head start... a sufficient head start is all that is needed. The first superintelligence, if it has enough of a head start, will be capable of bringing about the most efficient and powerful self-replicating substrate that can possibly be built. With it, exponential growth will ensure it can't be caught up with; all energy and matter resources could very well eventually be amassed by the first, which would then manage and distribute them accordingly. Without access to resources, no resistance can be built [lol]
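The head-start point can be made precise with a one-line growth comparison (a sketch, assuming for simplicity that both the first mover and a rival expand their resource bases exponentially at the same rate r, with the rival starting a delay Δ later):

\[
\frac{R_{\text{first}}(t)}{R_{\text{rival}}(t)}
  = \frac{R_0\, e^{r t}}{R_0\, e^{r (t-\Delta)}}
  = e^{r \Delta} > 1,
\qquad
R_{\text{first}}(t) - R_{\text{rival}}(t) = R_0\left(1 - e^{-r\Delta}\right) e^{r t}.
\]

The ratio stays constant while the absolute gap grows exponentially, so under these assumptions the latecomer never closes the head start unless it can sustain a strictly higher growth rate.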

edited

#14 Karomesis

  • Guest
  • 1,010 posts
  • 0
  • Location:Massachusetts, USA

Posted 03 April 2005 - 03:24 AM

I don't know where to start. If the assumption holds correct in due time, with all its potential dystopian as well as utopian outcomes, and there is no strong refutation of the argument, then we must inform all we can as to the situation at hand. A mass proselytization, if you will, of transhuman as well as posthuman ideologies. Now we enter the second half of the chessboard; this is where it gets really fun. [lol]

#15 vortexentity

  • Guest
  • 243 posts
  • 1
  • Location:Florida

Posted 03 April 2005 - 05:48 AM

Along these lines, I just finished reading John Robert Marlow's book Nano, and he addresses most of the themes and ideas in this thread. While he does become a bit of a sensationalist and kind of throws the second law of thermodynamics out the window from the start of the book, it does serve to consider most of your concerns and hopes for this technology. See www.johnrobertbarlow.com for his book.

My greatest concern is that people will fool around and find ways to make nanotech manipulate DNA. This could be very bad: if so much as a cosmic ray strikes the program circuitry of one of these tiny machines, we could find ourselves faced with a runaway situation.

I am less concerned about someone hacking their program safeguards than about an unforeseeable accident taking place. If a device can disassemble matter atom by atom and replicate itself out of the raw materials, then the potential for good is far too great not to develop the technology. But if the device errs and keeps taking apart things it is not supposed to, we could have real trouble that would become very hard to fix very quickly.

It is like looking out over the edge of the abyss and seeing heaven on the other side, but you must first cross the most dangerous place in human history to reach it.

We will soon be at that place, I am afraid. It will take more concentrated goodness and enlightenment to see this through than I think the mostly atheist scientific community can muster. Most scientists develop technology not to help mankind with their genius but simply to show others what their brilliant minds can do. It is this attitude that is most dangerous when it comes to nanotech and bio-engineering.

[:o]

#16 th3hegem0n

  • Guest
  • 379 posts
  • 4

Posted 05 April 2005 - 11:49 PM

I am less concerned about someone hacking their program safeguards than about an unforeseeable accident taking place


If you are concerned about the misuse, accidents, or other potential problems of nanotechnology, as any informed individual would be, I would suggest checking out www.singinst.org, specifically http://www.singinst.org/CFAI/

#17 lightowl

  • Guest, F@H
  • 767 posts
  • 5
  • Location:Copenhagen, Denmark

Posted 06 April 2005 - 12:15 AM

Ditto for Center for Responsible Nanotechnology.
http://www.crnano.org/

#18 treonsverdery

  • Guest
  • 1,312 posts
  • 161
  • Location:where I am at

Posted 10 May 2005 - 08:41 PM

I like nanotechnology.

#19 justinb

  • Guest
  • 726 posts
  • 0
  • Location:California, USA

Posted 24 June 2005 - 08:30 AM

Average IQ of AI researchers

What scale is he referring to? A standard deviation of 24?

#20 A941

  • Guest
  • 1,027 posts
  • 51
  • Location:Austria

Posted 29 June 2005 - 06:19 AM

You don't have much faith in human nature; well, neither do I. The more I think about it, the more I realize how important superintelligence in the form of benevolent AI will be, and I wish more people would see this.


What if the AI turns out not to be so benevolent, for example... like Skynet? [sfty]
I too don't have much faith in human nature, but I also don't have much faith in a human-made AI.

#21 Jay the Avenger

  • Guest
  • 286 posts
  • 3
  • Location:Holland

Posted 03 July 2005 - 11:16 PM

Indeed, MM seems to be right around the corner. We will have it within a decade, if Tihamer Toth-Fejel has anything to say about it:

http://www.nanomagaz...ihamertothfejel

He's working with Chris Phoenix of CRN at the moment.

More interesting interviews on the main page:

http://www.nanomagazine.com/



