
Singularity Institute Flyer 2



#1 Anand

  • Guest Singularity
  • 5 posts

Posted 04 September 2002 - 02:21 AM


Singularity Institute Flyer 2

© 2002 by Eliezer Yudkowsky

institute@singinst.org

www.singinst.org

404-550-3847

Permission granted to redistribute


“Here I had tried a straightforward extrapolation of technology, and found myself precipitated over an abyss. It's a problem we face every time we consider the creation of intelligences greater than our own. When this happens, human history will have reached a kind of singularity - a place where extrapolation breaks down and new models must be applied - and the world will pass beyond our understanding.”
—Vernor Vinge, True Names and Other Dangers

The amount of computing power a dollar will buy doubles every eighteen months. It's been doubling for the last fifty years. Today's desktop computer still has only a fraction of the computing power inherent in a single human brain, but the gap is quickly closing. Once the hardware (and, perhaps more difficult, the software) for real Artificial Intelligence exists, intelligence smarter than our own is only a few steps away. The number of neurons in a human brain hasn't changed in the last fifty thousand years; the speed of an individual neuron hasn't increased in millions of years. By comparison, the speed of an individual transistor doubles every two years, and once an AI exists it can run on ten or a hundred times as much hardware. Unlike us, an AI could modify, optimize, and redesign itself, continually improving its abilities. Only a short distance would separate real AI from minds vastly smarter and vastly faster than our own. Only a short distance would separate real AI from Vernor Vinge’s Singularity.
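
As a back-of-the-envelope illustration of that extrapolation (a sketch only, using the eighteen-month doubling period and fifty-year span quoted above), in Python:

    # Back-of-the-envelope sketch only: how far a fixed doubling period compounds.
    # Assumes the eighteen-month doubling period and fifty-year span quoted above.
    doubling_period_years = 1.5          # eighteen months
    span_years = 50.0
    doublings = span_years / doubling_period_years        # about 33 doublings
    growth_factor = 2 ** doublings                        # roughly ten-billion-fold
    print(f"{doublings:.1f} doublings, a factor of about {growth_factor:.1e}")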

Someone ought to be paying attention.

The Singularity Institute for Artificial Intelligence, Inc. is a 501(c)(3) nonprofit solely and directly devoted to a swifter and safer Singularity. We're interested in AI capable of self-understanding, self-modification, and recursive self-improvement ("seed AI"), and in AI that can absorb the cognitive processes underlying human altruism ("Friendly AI"). We've published original work on the theory of self-improving architectures and general intelligence, along with the first-ever specific proposal for implementing Friendly AI. We run the open-source project to implement the Flare programming language, and we are seeking funding to launch our main AI project.

What is “Seed AI”?

Seed AI is an AI that has been designed for self-understanding, self-modification, and recursive self-improvement. A successful seed AI could improve its own source code, gain a higher level of intelligence from those improvements, and continually make new rounds of improvements with higher levels of intelligence, thus reaching human-equivalent intelligence and beyond. (This may not even require human-equivalent intelligence to get started; human programmers have a visual cortex, auditory cortex, sensorimotor cortex, and so on, but we have no sensory modality for code...)
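
Purely as an illustration of the loop just described, here is a toy sketch in Python; every name in it is a hypothetical stand-in (a real seed AI would require far more than this), and it shows only the improve-evaluate-repeat structure:

    # Toy sketch of the improve-evaluate-repeat loop described above.
    # Every helper here is a hypothetical stand-in, not a real AI component.
    import random

    def propose_revision(source):
        # Stand-in for "the system rewrites part of its own source code".
        return source + random.choice(["# tweak A\n", "# tweak B\n", ""])

    def measure_capability(source):
        # Stand-in for "how capable is the system this source produces?"
        return len(source)

    def recursive_self_improvement(source, rounds=10):
        best, best_score = source, measure_capability(source)
        for _ in range(rounds):
            candidate = propose_revision(best)
            score = measure_capability(candidate)
            if score > best_score:        # keep only revisions that improve capability
                best, best_score = candidate, score
        return best

    print(recursive_self_improvement("pass\n"))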

For more about seed AI, please visit: www.singinst.org/seedAI

What is “Friendly AI”?

While much has been said about the need for benevolent AI, little had been said about how to implement it before the Singularity Institute's work. We recommend a set of cognitive architectures that should enable an AI to learn about Friendliness, to recover from mistakes made by the programmer in providing information about Friendliness, to acquire the cognitive complexity that humans use to think about goals, and to become human-equivalent or human-surpassing in the domain of altruism.

For more about Friendly AI, please visit: www.singinst.org/friendly

What is the “Singularity”?

The Singularity is the moment in human history marked by the technological creation of greater-than-human intelligence. For the past fifty thousand years, the cognitive performance of human brains has remained constant. Improve the brainware, and all the rules change; our old models break down and new ones must be applied. The Singularity Institute is presently the only nonprofit devoted solely to the Singularity. Because the Singularity is a global event with far-reaching consequences, we consider any success in achieving a swifter and safer Singularity to be worth every responsible effort that we can make.

For more about the Singularity, please visit: www.singinst.org/intro.html



