  LongeCity
              Advocacy & Research for Unlimited Lifespans





IBM Fellow: Moore's Law defunct


33 replies to this topic

#1 Mariusz

  • Guest
  • 164 posts
  • 0
  • Location:Hartford, CT

Posted 11 April 2009 - 12:49 PM


The IBM Fellow observed that like the railroad, automotive and aviation industries before it, the semiconductor industry has matured to the point that the pace of continued innovation is slowing.

"There was exponential growth in the railroad industry in the 1800s; there was exponential growth in the automobile industry in the 1930s and 1940s; and there was exponential growth in the performance of aircraft until [test pilots reached] the speed of sound. But eventually exponential growth always comes to an end," said Anderson


Rest of the article:
http://www.eetimes.e...16403342?pgno=1

So, the singularity is no longer near?
Well, this whole singularity concept was too good to be true anyway.

Mariusz

#2 kismet

  • Guest
  • 2,984 posts
  • 424
  • Location:Austria, Vienna

Posted 11 April 2009 - 06:02 PM

"The end of the era of Moore's Law, Anderson declared, is at hand" Well, that's what they say, like, every year?
Transistorwise this "law" may come to a halt within 10-15 years. But I'd expect more or less exponential performance inceases for the coming 20 years or longer. There are still somewhat advanced technologies left after the low hanging fruit: "optical interconnects, 3-D chips and accelerator-based processing [I guess this includes heterogenous processor design]"
If silicon scaling hits the wall (or the exotic materials which may replace silicon as we know it), improved yield, material and cooling will still allow us to produce bigger and faster chips at a similar price. Then they'll have more time to incrementally improve their IC design (any "re-spin" today is a slight improvement, it's just not worth to constantly re-spin old chips if you can replace them with a chip at a smaller node). Improved software and abandoning x86 may provide further growth.
Exponential or even linear growth may simply become more expensive (I think every node-transition is getting more and more expensive anyway); so, simply throw more money at the problem!
And eventually quantum computing - if it ever becomes feasible on a large scale - will enable us to solve some more exotic problems.
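A toy numerical illustration of that cost point (all figures below are invented for the sake of the example, not industry data): if cost per transistor keeps halving as density doubles, chip prices stay roughly flat; if cost per transistor stalls, staying on the exponential density curve means roughly doubling the spend per chip every generation.

# Toy model of the "throw more money at it" scenario. All numbers are
# invented for illustration; they are not industry figures.

def chip_cost(generation, base_transistors=1e9, base_cost_per_transistor=1e-8,
              cost_shrink_per_gen=0.5):
    """Price of one chip after `generation` density doublings.

    cost_shrink_per_gen = 0.5 mimics classic Moore's-law economics
    (cost per transistor halves each generation); 1.0 models stalled
    scaling, where the extra transistors must simply be paid for.
    """
    transistors = base_transistors * 2 ** generation
    cost_per_transistor = base_cost_per_transistor * cost_shrink_per_gen ** generation
    return transistors * cost_per_transistor

for gen in range(5):
    print(f"gen {gen}: classic ${chip_cost(gen):7.2f}   "
          f"stalled ${chip_cost(gen, cost_shrink_per_gen=1.0):7.2f}")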


#3 Cyberbrain

  • Guest, F@H
  • 1,755 posts
  • 2
  • Location:Thessaloniki, Greece

Posted 11 April 2009 - 08:32 PM

They've been saying Moore's Law will end almost every year. They said it would end in 2002 and again in 2008. Indeed it will end, but not for another 15-20 years, imo. Plus there are a huge number of emerging technologies that will take over from current chips.

#4 niner

  • Guest
  • 16,276 posts
  • 2,000
  • Location:Philadelphia

Posted 12 April 2009 - 03:11 AM

I don't think the singularity requires "Moore's Law" in order to happen. How many freakin' teraflops do we need, anyway? What we need is better software. Stupid software is still stupid, no matter how fast it is. So not to worry, the singularity hasn't been canceled.

#5 Connor MacLeod

  • Guest
  • 619 posts
  • 46

Posted 12 April 2009 - 07:56 AM

I don't think the singularity requires "Moore's Law" in order to happen. How many freakin' teraflops do we need, anyway? What we need is better software. Stupid software is still stupid, no matter how fast it is. So not to worry, the singularity hasn't been canceled.


I completely agree, and this is in part why I don't anticipate anything coming remotely close to "the singularity" happening any time soon.

#6 Guest_aidanpryde_*

  • Lurker
  • 0

Posted 12 April 2009 - 08:01 AM

The chief technical officer of Intel is of another opinion.

Intel predicts "singularity" around 2050.
http://www.techwatch...larity-by-2048/

Intel is interested in catoms/claytronics; here is a nice explanation:


I could not find the video of the conference regarding this issue, but this is also nice:


I do not really concern myself with the question of whether it will happen or not; it is much more important for every one of us to do everything possible to make it happen.

#7 valkyrie_ice

  • Guest
  • 837 posts
  • 142
  • Location:Monteagle, TN

Posted 12 April 2009 - 10:44 PM

The chief technical officer of Intel is of another opinion.

Intel predicts "singularity" around 2050.
http://www.techwatch...larity-by-2048/

Intel is interested in catoms/claytronics; here is a nice explanation:


I could not find the video of the conference regarding this issue, but this is also nice:


I do not really concern myself with the question of whether it will happen or not; it is much more important for every one of us to do everything possible to make it happen.


Utility fog is now claytronics. I wonder if Intel is simply trying to avoid using J. Storrs Hall's terminology for possible copyright reasons?

Still, it's nice to see that the biggest player in the game is now working on the concepts needed to make UF a reality. That's a massive step.

#8 Luna

  • Guest, F@H
  • 2,528 posts
  • 66
  • Location:Israel

Posted 13 April 2009 - 03:15 AM

Moore's law will live :D

#9 ihatesnow

  • Guest
  • 776 posts
  • 251
  • Location:rochester new york

Posted 13 June 2009 - 04:25 AM

http://www.newsoxy.c...ticle11654.html




http://www.ibm.com/d...i...3&S_CMP=EDU



#10 Taelr

  • Guest
  • 29 posts
  • 0
  • Location:Sunnyvale, CA

Posted 13 June 2009 - 04:22 PM

I'm not sure we need to worry too much about Moore's law. Yes, we do need a few more years, but perhaps not too many more. Hmm, this is a very moorish post. Remember the neuron is relatively slow, at an average of 300 Hz (kinda low tech); it is just that there are 200 billion of them operating together in parallel, and the interconnects aren't even electrical. To achieve AGI we do not necessarily need very high-powered single processors but more parallelism, and that feels very achievable within the next decade or so if Moore's law holds up for that time (a rough back-of-the-envelope version of this is sketched below).

But beyond the first AGI, let's say human-level equivalent, there is the expectation that AGI will continue to improve and supersede human levels, i.e. superintelligence, and it is in that arena where I suspect we would need Moore's law to continue for quite a bit longer, and that's where there might be some doubt.
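A rough back-of-the-envelope sketch of the parallelism point above, taking the post's own figures at face value (about 300 Hz average firing rate, 200 billion neurons) and treating one spike as one "operation", which is of course a drastic simplification:

# Back-of-the-envelope arithmetic using the figures quoted in the post
# above (estimates of neuron count vary; treating one spike as one
# "operation" is a drastic simplification, so this only shows scale).

neurons = 200e9          # neuron count as quoted in the post
firing_rate_hz = 300.0   # average spikes per second per neuron, per the post

aggregate_events_per_s = neurons * firing_rate_hz
print(f"Aggregate firing events per second: {aggregate_events_per_s:.1e}")  # ~6.0e+13

# Compare with a hypothetical 3 GHz core retiring ~4 instructions per cycle:
ops_per_core_per_s = 3e9 * 4
cores_to_match = aggregate_events_per_s / ops_per_core_per_s
print(f"Equivalent number of such cores (raw event rate only): {cores_to_match:,.0f}")

By that crude measure the brain's aggregate event rate is on the order of 6×10^13 per second, i.e. thousands of conventional cores, which is why the argument leans on parallelism rather than single-core speed.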

#11 Mr. Jingles

  • Guest
  • 30 posts
  • 2

Posted 18 June 2009 - 04:28 PM

I believe that Moore's Law will hold for a few more years in semiconductors. That will give us some time to develop optoelectronics to continue exponential growth.

However, I fear that more powerful computing will be unavailable to ordinary people, and therefore truly massive parallel computing will be delayed.

Netbooks are being promoted as the next computing platform for consumers. What that means is that most people will have just enough computing power for computer-driven consumerism. The result is that demand for more powerful processors will decrease, which means fewer of them will be produced. That will drive up their cost per unit dramatically, resulting in very cheap consumer-class computing devices (consumer electronics gadgets) and very expensive "institutional-class" computing devices (powerful server-type machines).

Because of this, the client-server model could persist well into the future. What we should be embracing are more decentralized computing models. Distributed computing projects would fail in a world where everyone had a netbook and only institutions had powerful processing.

#12 Athanasios

  • Guest
  • 2,616 posts
  • 163
  • Location:Texas

Posted 18 June 2009 - 05:31 PM

I am unsure whether the netbook craze will hurt the design of larger chips. People will still be asking more and more out of netbooks. Sure, the focus will be on power usage, heat, and size in relation to computational power, but would that not also be beneficial for larger chips and cloud providers?

I do not see how distributive computing is any better than a cloud run by Google. Distributive computing may be great marketing, a way to keep real total costs hidden, and a good use of a market inefficiency, but I do not see the argument for it over making the market more efficient by reducing computational waste and taking into account total real costs.

I am not trying to 'argue' as much as gain insight into why my assumptions may not be true.

As for Moore's law, I think it all depends on how you define it. It will last as long as the definition's focus is 'performance' and you allow the definition of 'performance' to shift according to need.

#13 MicroBalrog

  • Guest
  • 10 posts
  • 0

Posted 19 June 2009 - 01:38 AM

Moore's law is defined in chip density. It is not directly tied to performance.
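For reference, the usual quantitative reading of that observation (the doubling period T is commonly quoted as roughly two years; Moore's original 1965 figure was closer to one year, so treat T as an approximation):

N(t) \approx N_0 \cdot 2^{(t - t_0)/T}, \qquad T \approx 2\ \text{years}

where N(t) is the transistor count of a leading integrated circuit at time t and N_0 the count at a reference date t_0. Nothing in that statement refers directly to clock speed or performance.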

#14 niner

  • Guest
  • 16,276 posts
  • 2,000
  • Location:Philadelphia

Posted 19 June 2009 - 02:26 AM

Moore's law is defined in chip density. It is not directly tied to performance.

Good point. Everyone seems to equate it with performance anyway, which, while not technically correct, is a pretty good approximation. Anyway, I'll make the following assertion: powerful AGI doesn't necessarily need faster hardware; it needs better knowledge representations and algorithms. The only reason that near-infinite power is needed is for the brute-force Blue Brain approach. Not that there's anything wrong with that... Brute force has a lot of advantages these days. Maybe the way the Singularity will happen is that we will build the first superhuman AI using brute force, and then it will figure out how to do it as an iPhone app.

#15 Athanasios

  • Guest
  • 2,616 posts
  • 163
  • Location:Texas

Posted 19 June 2009 - 02:49 AM

Moore's law is defined in chip density. It is not directly tied to performance.

Exactly. The forces driving it are doing so in search of better performance. When we are forced into a paradigm shift to get more performance, the practical aspect of the law will continue regardless of what happens to the strict definition.

#16 Mr. Jingles

  • Guest
  • 30 posts
  • 2

Posted 21 June 2009 - 07:58 PM

People will still be asking more and more out of netbooks.

Software requirements follow hardware performance, not the other way around. People don't write software that no computer can run.

People's computing demands are driven by the software they want to run. The major exception to this is when users get into the habit of multitasking. That will push up their computing resource needs, but probably only linearly, and certainly not exponentially.

Another major exception is graphics-driven applications. Those needs could grow exponentially over time. Hardware makers could make it purposely difficult to use the graphics processing for general-purpose applications.

If hardware makers cooperated instead of competing, they could dramatically drive down their production (not to mention research and development) costs by agreeing on that (artificial) limit, without charging a very large price premium. Software for that platform requiring additional power would then go unwritten, thereby not driving consumer demand for computational power upward.

I do not see how distributive computing is any better than a cloud run by Google. Distributive computing may be great marketing, a way to keep real total costs hidden, and a good use of a market inefficiency, but I do not see the argument for it over making the market more efficient by reducing computational waste and taking into account total real costs.

I thought that cloud computing was distributed. Are you talking about centrally-managed distributed computing? To be honest, the recent buzz about cloud computing sounds more like marketing than distributed computing. I would like to hear more about it, with specific definitions that I can understand.

What I mean is, I think that a business model where ordinary people have proprietary devices with fixed (and minimized) computational power will drive up the cost of more flexible, powerful machines (like the PCs we are used to). They will only be affordable for institutions and the wealthy.

I believe that that would decimate the total surplus processing of humanity. The only ones who could then meaningfully contribute to distributed computing projects such as Folding@home would be entities with no incentive to do so. To me, that's the scariest part.

You bring up great points. You mentioned you were interested in seeing weaknesses in them. I assure you, I am even more hopeful that I am flat wrong, because otherwise the decisions about which "very difficult problems" to try to solve will be made by a very few people, certainly not us.

#17 Athanasios

  • Guest
  • 2,616 posts
  • 163
  • Location:Texas

Posted 21 June 2009 - 09:36 PM

You bring up great points. You mentioned you were interested in seeing weaknesses in them. I assure you, I am even more hopeful that I am flat wrong, because otherwise the decisions about which "very difficult problems" to try to solve will be made by a very few people, certainly not us.


I understand your points and think them valid. However, I have also thought these arguments valid since I first laid my hands on a personal computer, back when it was just a calculator or word processor at most. Human ingenuity seems to always come up with a need or want, ever since that first use of the bone as a weapon in 2001: A Space Odyssey. When netbooks and phones are up to people's demands, it will be the processors in glasses that paint your eye with an image, or whatever. Is there something that inherently makes smaller processor advancements not applicable to larger ones? I may be out of my realm of expertise, but it seems that advances in going smaller would also benefit larger chips enormously.

I also think there will be just as much, if not more, demand for powerful enterprise computing in the future as we rely more and more on computation for advancement. With cloud-like computing, if it is cheaper to use a farm of small consumer computers, it will be done that way.

BTW, I also consider cloud computing technically to be distributive. I thought you may have meant distributive in the marketing sense, as in connecting a bunch of individuals' personal computers.

#18 Athanasios

  • Guest
  • 2,616 posts
  • 163
  • Location:Texas

Posted 21 June 2009 - 09:41 PM

I believe that that would decimate the total surplus processing of humanity. The only ones who could then meaningfully contribute to distributed computing projects such as Folding@home would be entities with no incentive to do so. To me, that's the scariest part.

Folding@home and others would just buy computation with donations. Right now, people are donating money in the form of energy costs when they run it. What makes it cost-effective now, compared to a donation, is that people have been forced to buy more processing than they want. This is a market inconsistency that, if you are a believer in free markets, we would be better off without.

#19 Mr. Jingles

  • Guest
  • 30 posts
  • 2

Posted 23 June 2009 - 04:20 PM

Folding@home and others would just buy computation with donations. Right now, people are donating money in the form of energy costs when they run it. What makes it cost-effective now, compared to a donation, is that people have been forced to buy more processing than they want. This is a market inconsistency that, if you are a believer in free markets, we would be better off without.

I am afraid that in the scenario I described, processing costs would go up rather than diminish.

I believe that markets are manipulated by market makers, and they always maximize their own wealth (usually in the form of profit). In America, people demand higher and higher Internet capacity. Telecoms don't care what people demand. People will pay the same for less because they have no choice. There is no good reason other than profit maximization that Americans should not have a twentyfold increase in their connection speed. Although connection capacity is growing ever cheaper, they intend to implement a price structure like that of mobile telephones. It is a manipulated market, not a free one.

I think I may have been very wrong in my thoughts concerning computer graphics processing. 3D rendering at very high resolutions with bleeding-edge effects consumes vast computation. Once that rendering is done, however, it can be played on your Blu-ray player at minimal cost (but high data size). For gaming, in order to be interactive, the computer needs to do a lot of calculations to get the graphics right. For the consumer market, as long as the icons are shiny and bouncy enough, that's okay with them.

Today I saw an extension to X-Windows that has been around for some time. It has very good visual effects, and my colleague said they worked on his AMD Athlon 2000 with a very small amount of RAM. Why would they offer more processing when most people don't know what to do with it?

That gives me a bad feeling in the pit of my stomach. It will price powerful computing out of reach.

#20 Medical Time Travel

  • Guest
  • 126 posts
  • 2

Posted 12 January 2010 - 03:34 AM

It seems to me that spintronics is the natural extension of electronics, and thereby of Moore's law:

http://en.wikipedia....iki/Spintronics
http://www.nanotech-...spintronics.htm

#21 valkyrie_ice

  • Guest
  • 837 posts
  • 142
  • Location:Monteagle, TN

Posted 12 January 2010 - 06:32 AM

It seems to me that spintronics is the natural extension of electronics, and thereby of Moore's law:

http://en.wikipedia....iki/Spintronics
http://www.nanotech-...spintronics.htm



Three developments since then:

DNA assembly of CNT structures
A graphene process to replace copper in chips
Development of an ink usable to print entire electronic devices

All three of which will lead to faster, lower-power, cheaper chips.

Moore's law will indeed hit a plateau. But not for some time yet.

Chip development has always outpaced software development, and it probably will for another couple of decades, though software is rapidly increasing in complexity. What we are primarily improving is our ability to sort and catalog the massive amounts of data we create. And that has far more real use than improving game code or utility programs like word processors. The amount of scientific knowledge available to data-mining techniques is staggering, but as yet it is only poorly accessed. Refinements in how we cope with massive data sets will be a defining point in our advance to human enhancement.

#22 imm1288

  • Guest
  • 15 posts
  • 2

Posted 12 January 2010 - 07:10 AM

If you look at the Wikipedia articles on the CMOS roadmap, they seem to suggest that there is at least a half-decade of Moore's law left, but maybe not 15 years, unless additional scientific advances are made (granted, Intel has a great track record with this):

http://en.wikipedia....ki/16_nanometer

The article on the 11nm process seems to suggest that further progress becomes questionable at that point due to quantum tunneling:

http://en.wikipedia....ki/11_nanometer

As a software engineer myself, I agree with the other commenters who have stated that the main problem is not slow processors, it's stupid software. Software design patterns are narrowly focused on very specific business objectives. While this is appropriate for a business setting, it's worth pointing out that software hasn't become more effective at the same rate that Moore's law has extended the complexity of chips.

That said, there is some smart software that does benefit strongly from Moore's law. In my opinion, the biggest opportunity for applying IT to biotechnology and life extension is in realistic modeling software that simplifies analysis of the human body. One of the main barriers to more effective biotech is the sheer complexity of the human body. Moore's law can be directly applied to this kind of analysis, by making it possible to run simulations more easily, quickly, and realistically, which would significantly ease the process of developing future medical advances. This is the one reason why I am excited by Moore's law in a biotechnology context, and also why I'm excited about the advances of Folding@Home (not just their processing power but also their algorithmic advances, which I think they have open-sourced).

Edited by imm1288, 12 January 2010 - 07:15 AM.


#23 n25philly

  • Guest
  • 88 posts
  • 11
  • Location:Holland, PA

Posted 12 January 2010 - 03:17 PM

Odds are that when the singularity arrives, computers will be very different from what they are today. Newer technologies like memristors could be key if they take off. Even if they don't, I see what is going on right now with GPUs as a sign of the future. Parallel processing is going to take over at some point. The CPU will likely still be there, but GPUs and the power they can provide are becoming more and more important in regular computing.

#24 Elus

  • Guest
  • 793 posts
  • 723
  • Location:Interdimensional Space

Posted 12 January 2010 - 06:54 PM

I'm surprised at this thread. You guys aren't mentioning the obvious: Moore's law is just one of many paradigms. It's a bit silly to think that our chips will continue to utilize only 2 dimensions.

As Mr. Kurzweil has repeatedly said - Moore's law will run out of steam after we have established 3 dimensional molecular circuits.

/thread

#25 babcock

  • Guest
  • 299 posts
  • 73
  • Location:USA

Posted 12 January 2010 - 07:20 PM

Just going to throw this article into the fray in an attempt to remind everyone that we are on quantum computing's doorstep.

http://www.scienceda...00111091222.htm

IMO Moore's law will continue producing gains until we hit quantum computing. As was already stated in this thread, researchers are finding nanomaterials to replace copper and the other typical materials used in semiconductors, as well as studying the use of living cells to perform computation. Moore's law won't cease to produce gains until processing units take on a different form than they currently have. I highly doubt we will hit the wall proposed at the end of the curve and sit there for any length of time. Researchers have been reinventing ways to pack and make transistors since Moore's law was formulated.

#26 PWAIN

  • Guest
  • 1,288 posts
  • 241
  • Location:Melbourne

Posted 13 January 2010 - 12:20 AM

The article is a bit unclear as to what has actually been achieved...

Taiwan leads in 16-nm SRAM revolution

http://www.taiwantod...7144&CtNode=416

#27 acrossuniv1

  • Guest
  • 19 posts
  • 0

Posted 13 January 2010 - 12:54 AM

Rather than wondering when or if the "singularity" will happen, shall we instead do something to help make it happen?

#28 Reno

  • Guest
  • 584 posts
  • 37
  • Location:Somewhere

Posted 13 January 2010 - 01:11 AM

Rather than wondering when or if the "singularity" will happen, shall we instead do something to help make it happen?


What would you suggest across?

#29 acrossuniv1

  • Guest
  • 19 posts
  • 0

Posted 13 January 2010 - 01:30 AM

Rather than wondering when or if the "singularity" will happen, shall we instead do something to help make it happen?


What would you suggest across?


I would suggest the following. These are actually quite common and frequently used strategies, but I think the key is to do them now. For those who want to see the singularity happen, maybe this is what we can do to help:

1. Form a transparently administered "singularity promotion foundation" to invest in facilitating the creation of new technology for the singularity, and donate to it. I am sure, for example, that current artificial intelligence research definitely has room for more funds to make discoveries quicker.
2. Form a singularity forum to collectively suggest more ideas (even from semi-professionals or laymen) on novel technologies.
3. Form a community forum to discuss, gather ideas for development, and promote "the goal of the singularity."

Thanks


#30 imm1288

  • Guest
  • 15 posts
  • 2

Posted 13 January 2010 - 08:12 AM

I'm surprised at this thread. You guys aren't mentioning the obvious: Moore's law is just one of many paradigms. It's a bit silly to think that our chips will continue to utilize only 2 dimensions.

As Mr. Kurzweil has repeatedly said - Moore's law will run out of steam after we have established 3 dimensional molecular circuits.

/thread


I think that Moore's Law has to do with the number of transistors on an integrated circuit, so it wouldn't matter whether it is 2-dimensional or 3-dimensional.

I agree with other posters who note that specialized processing units (e.g. GPUs, but many other kinds as well) will become more important vis-a-vis the CPU in the future. CPU power is still very important right now, though, because it doesn't require a rewrite of existing software, which is costly and time-consuming.



