If I understand the goal of the BlueGene project correctly, they are attempting to build a lookup table of folding pathways and kinetics based on protein structures. To accomplish this lofty goal (given the vast number of protein permutations that exist in nature), they're throwing a previously unheard-of petaflop of processing power at the problem.
Now, if I understand correctly, protein folding has been the biggest barrier to building a complete computer model of biomolecular behavior within a cell, being at present so computationally costly to model as to be worthless. With a lookup table of how each protein folds, that cost is largely eliminated: the computations have already been done once, and the answers can be stored for fast retrieval rather than recomputed every time they're needed.
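The precompute-once, look-up-forever idea is just memoization at a grand scale. A minimal sketch of the trade-off (all names here are hypothetical, and the "folding" is a dummy stand-in, not a real simulation):

```python
# Toy illustration of the lookup-table idea: pay the folding cost once,
# then answer every later query from the precomputed table.

def fold(sequence):
    """Stand-in for an expensive folding simulation.
    In reality this is the part that eats supercomputer-months."""
    return "structure-of-" + sequence

# Precompute once (the "BlueGene phase") over a hypothetical sequence set...
folding_table = {seq: fold(seq) for seq in ["MKTAYIA", "GSSGSSG"]}

# ...then every later cell-scale simulation pays only a dictionary lookup.
def lookup_fold(sequence):
    return folding_table[sequence]
```

The catch, of course, is the size of the table: the scheme only works if the set of structures you actually need to query is enumerable in advance.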
This leads me to wonder how long it will be until we see computer simulations of the complete biochemical behavior of multicellular lifeforms. How long until we can model sponge colonies? Insects? Simple vertebrates? Fish? Reptiles? Mammals? Man?
Kurzweil estimates that by 2009 a $1000 personal computer will have reached teraflop performance (roughly the level of ASCI Red, which held the #1 supercomputer slot for several years after its construction), and that supercomputers will have reached the estimated computational power of the human brain: 20 petaflops, or 20 times as powerful as BlueGene/L.
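As a rough sanity check on the gap those numbers imply, take 1 teraflop = 10^12 flops, 20 petaflops = 2×10^16 flops, and an assumed 18-month Moore's-law doubling time (the doubling time is my assumption, not Kurzweil's figure):

```python
import math

pc_flops = 1e12       # Kurzweil's 2009 $1000 PC: 1 teraflop
brain_flops = 20e15   # his human-brain estimate: 20 petaflops

gap = brain_flops / pc_flops   # how far a single PC falls short
doublings = math.log2(gap)     # Moore's-law doublings needed to close it
years = doublings * 1.5        # at an assumed 18 months per doubling

print(gap, round(doublings, 1), round(years, 1))  # -> 20000.0 14.3 21.4
```

So on these assumptions, a single cheap PC sits about 20,000x (or two decades of doublings) behind the brain estimate.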
Imagine a future supercomputer (or an Internet network of personal computers along the lines of BOINC, the Berkeley Open Infrastructure for Network Computing) which, given a fertilized human egg as input, proceeds to grow a biomolecular model of a human being.
20 petaflops probably isn't going to be enough, but imagine if you could get half a million teraflop PCs working on the problem in their spare CPU time. Even if you couldn't model it in realtime, what you'd eventually end up with, maybe a few years of computation down the road (remember that the available computing power grows exponentially thanks to Moore's law), is a fully developed human baby inside of a computer. And right there you have an entire bona fide blueprint of consciousness (or of what will become consciousness if you let the simulation run longer) sitting inside of a computer. From that you can analyze the biomolecular operation of the brain and hopefully reduce it to a substantially simpler structure. Then figure out how it works and start improving it: load it into a profiler (if only natural selection had one of those), look for the bottlenecks, and eliminate them. By the time we can perform this level of biomolecular modelling, supercomputers will have vastly exceeded the computational power of the brain.
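The half-million-PC figure is worth a back-of-envelope check against the 20-petaflop brain estimate above (ignoring, as an admitted simplification, the communication overhead that dominates real distributed projects):

```python
pcs = 500_000
pc_flops = 1e12                   # one teraflop per PC, per the estimate above
network_flops = pcs * pc_flops    # aggregate: 5e17 flops, i.e. 0.5 exaflops

brain_flops = 20e15               # the 20-petaflop brain estimate

print(network_flops / brain_flops)  # -> 25.0: roughly 25 "brain-equivalents"
```

So even naively, the hypothetical network overshoots the single-brain estimate by an order of magnitude, which is why the bottleneck would be coordination, not raw flops.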
And you know what that means...
From this, can't we reasonably conclude that the latest possible date for the Singularity is the point at which this kind of biomolecular modelling becomes computationally feasible?