  LongeCity
              Advocacy & Research for Unlimited Lifespans





Supercomputers are about to get a lot more super


7 replies to this topic

#1 Live Forever

  • Guest Recorder
  • 7,475 posts
  • 9
  • Location:Atlanta, GA USA

Posted 26 June 2006 - 03:52 PM


Link to article

Supercomputers are about to get a lot more super
ROBERT S. BOYD
Knight Ridder Newspapers

WASHINGTON - The federal government is pushing computer scientists and engineers to greatly step up the speed and capacity of America's supercomputers.

Officials say much faster performance is needed to handle a looming tidal wave of scientific, technical and military data.

Powerful new telescopes, atom-smashers, climate satellites, gene analyzers and a host of other advanced instruments are churning out enormous volumes of data that will overwhelm even the swiftest existing machines.

In the next five years, the government's goal is a computer system that can process at least a quadrillion (a million times a billion) arithmetic operations per second. The best current machines operate in the trillions (a thousand times a billion) of calculations per second.

"Within the next five to 10 years, computers 1,000 times faster than today's computers will become available. These advances herald a new era in scientific computing," according to Raymond Orbach, undersecretary for science at the Department of Energy.

A quadrillion-rated computer, known technically as a "petascale" system, will be more than three times faster than today's top supercomputer, IBM's BlueGene/L, which holds the world record at 280 trillion operations per second.

"Peta" is the prefix for a quadrillion in the metric system. "Tera" stands for a trillion, so BlueGene is a terascale system.

On a more familiar level, a petascale computer will be at least 75 times faster than the most powerful game machine, such as Microsoft's Xbox 360, and 100 times faster than a top-of-the-line desktop personal computer, such as the Apple Power Mac.
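As a quick sanity check on those prefixes and ratios, here is a minimal Python sketch using only the figures quoted in the article; the variable names are illustrative.

```python
# Metric prefixes from the article, expressed in operations per second.
TERA = 10**12   # "tera" = a trillion
PETA = 10**15   # "peta" = a quadrillion (a thousand tera)

blue_gene_l = 280 * TERA     # BlueGene/L's record rate cited in the article
petascale_goal = 1 * PETA    # the government's five-year target

speedup = petascale_goal / blue_gene_l
print(f"A 1-petaflop system would be about {speedup:.1f}x BlueGene/L")  # ~3.6x
```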

On Tuesday, RIKEN, a Japanese research agency, announced that it had built a computer system that theoretically can perform 1 quadrillion operations per second. If so, this would be the world's first true petascale computer.

Henry Tufo, a computer scientist at the University of Colorado, Boulder, who operates a BlueGene/L system, said it would take petascale computer power to solve problems that stump present-day systems.

"One of the most compelling and challenging intellectual frontiers facing humankind is the comprehensive and predictive understanding of Earth and its biological components," Tufo said in an e-mail message. "Petascale systems will open up new vistas (for) scientists."

To meet this goal, the National Science Foundation asked researchers June 6 to submit proposals to develop the infrastructure for a petascale computing system to be ready by 2010.

As examples of difficult questions that only a petascale system could handle, the NSF listed:

• The three-dimensional structure of the trillions of proteins that make up a living organism. Proteins are the basic building blocks of all living things.

• The ever-changing interactions among the land, ocean and atmosphere that control the Earth's maddeningly complex weather and climate systems.

• The formation and evolution of stars, galaxies and the universe itself.

The Department of Energy also is offering $70 million in grants for teams of computer scientists and engineers to develop petascale software and data-management tools.

"The scientific problems are there to be solved, and petascale computers are on the horizon," said Walter Polansky, senior technical adviser in the DOE'S Office of Advanced Scientific Computing.

For example, the Energy Department wants ultra-fast computers in order to determine the 3-D structure of molecules that let drugs pass through cell membranes, knowledge that can be vital against cancer.

"This is completely new," Orbach wrote in the current issue of Scientific Discovery through Advanced Computing, a DOE publication. "No one has ever probed that region of science before."

The Energy Department also needs petascale computing to help solve problems that are blocking the development of nuclear fusion, an unlimited, nonpolluting energy source that's baffled designers for decades.

The DOE and NASA, the space agency, are collaborating in an effort to determine the nature of the dark energy and dark matter that are thought to make up 95 percent of the universe. Petascale computer power will be needed here, too.

#2 Live Forever

  • Topic Starter
  • Guest Recorder
  • 7,475 posts
  • 9
  • Location:Atlanta, GA USA

Posted 26 June 2006 - 03:53 PM

What implications does this have for AI research/development?


#3 Bruce Klein

  • Guardian Founder
  • 8,794 posts
  • 242
  • Location:United States

Posted 26 June 2006 - 04:06 PM

Processing speed does not seem to be the main bottleneck... but rather software design.

#4 Kalepha

  • Guest
  • 1,140 posts
  • 0

Posted 26 June 2006 - 04:18 PM

What implications does this have for AI research/development?

Probably many, especially for intelligence agencies with human executives. For AGI research/development, toward recursively self-improving AGI executives, probably not nearly as many, as Bruce indicates.

#5 Live Forever

  • Topic Starter
  • Guest Recorder
  • 7,475 posts
  • 9
  • Location:Atlanta, GA USA

Posted 26 June 2006 - 04:42 PM

Aaah ok, I see. I heard someone say once (possibly Eliezer, but I can't be certain) that the more hardware power you have, the less intelligent you would have to be on the software side, and the less hardware power you have, the more it has to be made up for with intelligent coding.
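As a toy illustration of that tradeoff (my own example, not anything from the talk being recalled): the brute-force scan below leans on raw hardware cycles, while the binary search leans on smarter coding to reach the same answer with a tiny fraction of the work.

```python
# Illustrative only: the same lookup done the "dumb" way (lots of hardware)
# and the "smart" way (better algorithm, almost no hardware).
import bisect

data = list(range(10_000_000))   # a large sorted list
target = 8_765_432

# Brute force: O(n) comparisons -- trivial to write, expensive to run.
index_slow = next(i for i, x in enumerate(data) if x == target)

# Binary search: O(log n) comparisons -- more thought, far fewer cycles.
index_fast = bisect.bisect_left(data, target)

assert index_slow == index_fast
print(f"linear scan: ~{target:,} comparisons; binary search: ~{len(data).bit_length()}")
```

More hardware lets the slow version finish in acceptable time; with less hardware, the smarter version is the only option.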

#6 maestro949

  • Guest
  • 2,350 posts
  • 4
  • Location:Rhode Island, USA

Posted 26 June 2006 - 04:49 PM

I agree with Bruce. Moving the ball forward on algorithms is just as important, if not more so. Besides AI, many of the most important in silico challenges related to aging are at the molecular level and are computationally intractable. You could saturate a few petaflops of CPU fairly easily with even the best non-polynomial simulation algorithms, given a sufficient network size. And the network we're looking to simulate is more than sufficient :)

This is where the molecular dynamics behind structural simulation and protein folding has been stuck for some time, hence the need to break the tasks up and farm them out through large-scale distributed projects like Folding@home, etc.
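To put rough numbers on that wall, here is a hypothetical back-of-the-envelope sketch; the atom count, cost per pairwise interaction, and timestep are illustrative assumptions, not figures from this thread.

```python
# Back-of-the-envelope: why all-pairs molecular dynamics saturates even a
# petaflop machine. Every constant below is an illustrative assumption.
machine_flops = 1e15        # assumed sustained rate of a petascale system
n_atoms = 1_000_000         # assumed size of a solvated protein system
flops_per_pair = 50         # assumed cost of one pairwise force evaluation
dt_seconds = 1e-15          # assumed 1-femtosecond integration timestep

pairs_per_step = n_atoms * (n_atoms - 1) / 2        # O(N^2) interactions
flops_per_step = pairs_per_step * flops_per_pair
steps_per_wallclock_second = machine_flops / flops_per_step

sim_time_per_wallclock_second = steps_per_wallclock_second * dt_seconds
years_per_simulated_millisecond = 1e-3 / sim_time_per_wallclock_second / 3.15e7

print(f"timesteps per wall-clock second: {steps_per_wallclock_second:.0f}")        # ~40
print(f"wall-clock years per simulated millisecond: {years_per_simulated_millisecond:.0f}")
```

Cutoffs, neighbor lists and farming the sampling out to distributed projects all exist to chip away at exactly this kind of gap.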

Biomimicry (genetic algorithms and the like) seems to be the best way to overcome these challenges.

#7 arrogantatheist

  • Guest
  • 56 posts
  • 1

Posted 29 June 2006 - 03:54 AM

One thing with AI is "build it and they will come." That is what Intel realized with desktop computers. The hardware has to come before the software, and it's obvious why when you think about it: no one is going to build software that can't run on anything!


#8 arrogantatheist

  • Guest
  • 56 posts
  • 1

Posted 29 June 2006 - 03:57 AM

I'd like to add that this is bold funding by the US government. The US is clearly the leader in supercomputing right now, with over half of the systems on the TOP500 list of the most powerful machines.



