exapted,
You are spot on about software bloat, miniaturization, new computing paradigms, etc. Another factor is energy efficiency. While you may be right that Moore's law could be outpaced, there are a number of metrics by which all of this must be judged. We can overclock current CPUs far higher than older ones; some go over 6 GHz in overclocking competitions, but the power consumption and wear on the chips are enormous. So trends based on simple price/performance ratios that don't factor in durability and energy efficiency are dubious.
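To make that concrete, here is a rough back-of-the-envelope sketch in Python. Every figure in it (prices, ops/sec, wattages, electricity cost, lifetime) is invented purely for illustration, not taken from any real benchmark; the point is just that a chip which wins on sticker price/performance can look much worse once power draw over its lifetime is folded in.

```python
# Toy comparison of two hypothetical CPUs. All numbers are made up
# for illustration, not real benchmarks.
chips = {
    "modest_stock": {"price": 200.0, "ops_per_sec": 50e9, "watts": 65.0},
    "overclocked":  {"price": 200.0, "ops_per_sec": 80e9, "watts": 180.0},
}

HOURS = 3 * 365 * 24   # assume three years of continuous use
KWH_PRICE = 0.12       # assumed electricity cost in $/kWh

for name, c in chips.items():
    energy_cost = c["watts"] / 1000.0 * HOURS * KWH_PRICE
    total_cost = c["price"] + energy_cost
    print(f"{name:13s} ops/$ (sticker): {c['ops_per_sec'] / c['price']:.2e}  "
          f"ops/$ (with power): {c['ops_per_sec'] / total_cost:.2e}  "
          f"ops/joule: {c['ops_per_sec'] / c['watts']:.2e}")
```

With these made-up numbers the overclocked chip wins on raw ops per dollar of purchase price but loses badly on ops per joule, which is exactly the kind of thing a naive price/performance trend line hides.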
Software bloat seems to have been decreasing as a result of Linux and Mac putting pressure on Windows to strip down, and we are seeing some cool new compilers like Clang and new GPGPU frameworks like OpenCL. Yet games are just as demanding as ever, if not more so. With ray tracing, the sky is the limit. Even Intel's upcoming Larrabee chip, with its 32 cores, is only barely able to do real-time ray tracing. I have a feeling that even as the "bloat" goes away, it will be more than compensated for by new features that will quickly be seen as essential: higher definition video, stereoscopic 3D, haptics, you name it.
As for new computing paradigms, there are a number of promising avenues. Heterogeneous computing on a single chip, using different specialized cores for specific tasks, could provide immense performance improvements. Another promising approach is biological circuits that can reconfigure themselves on demand, using new designs that have been downloaded or evolved through genetic algorithms.
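I don't know of any concrete reconfigurable biological-circuit platform that works this way yet, but as a rough illustration of the "evolve a design" part, here is a toy genetic algorithm in Python. The bit-string encoding, the fitness function, and all the parameters are my own made-up stand-ins, not tied to any real hardware; it just evolves a bit string toward an arbitrary target pattern.

```python
import random

# Toy genetic algorithm: evolve a bit string toward an arbitrary
# target "design". Everything here is illustrative only.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
POP_SIZE, GENERATIONS, MUTATION_RATE = 50, 200, 0.02

def fitness(genome):
    # Count how many bits match the target design.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    # Flip each bit independently with a small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def crossover(a, b):
    # Single-point crossover between two parent genomes.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    # Keep the top half as parents, refill the rest with mutated offspring.
    parents = population[: POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print(f"best after {gen} generations:", population[0],
      "fitness:", fitness(population[0]))
```

The real research question, of course, is whether anything like this selection loop can be run against circuits made of actual biological components rather than a list of bits.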
Yet another exciting new computing paradigm is the memristor. We might see a unified memory based on memristors that would be more efficient, denser, and faster than either RAM or hard drives. It may also be especially useful for certain niche applications and algorithms, though not as specialized as quantum computers.
I don't think it really matters which motivation is the primary driver for advancement. It's much like how the push for thinner, brighter displays led to OLEDs, which also happened to be more efficient, flexible, and so on. Perhaps it is even good that scientists are tackling the issues from many fronts.
Of course, even with all of this, our knowledge of the brain, even from a reductionist standpoint, is pretty pitiful, and that ignorance makes it really hard to make any predictions.
Furthermore, it should go without saying that even once we have a good idea of how the brain works, there really isn't any way of knowing whether the resulting intelligence would have qualia or would be a "p-zombie." Thus, we shouldn't expect to be able to upload ourselves, at least not onto a digital substrate. We would need another substrate that is homologous to our biological one, and perhaps nothing else is, in which case we will instead need to figure out how to improve our wetware. Artificial brains could make great research assistants, but the ethics around this whole issue are very murky, and caution should be paramount.