  LongeCity
              Advocacy & Research for Unlimited Lifespans





How accurate are Ray Kurzweil's predictions?

kurzweil singularity breakthroughs biomedicine dna sequencing computing brain artificial intelligence robotics

280 replies to this topic

#271 bluemoon

  • Guest
  • 761 posts
  • 94
  • Location:south side
  • NO

Posted 08 January 2022 - 07:43 PM

 

 

I don't think advances in computing are continuing along the exponential path Kurzweil predicted. The signs are everywhere that progress is running out of steam. 

 

From six weeks ago: 

 

Intel CEO Pat Gelsinger promised at the company's online Innovation event that "Moore's law is alive and well," adding that “we are predicting that we will maintain or even go faster than Moore's law for the next decade […] we expect to even bend the curve faster than a doubling every two years.”

 

https://www.tti.com/...k-20211124.html



#272 QuestforLife

  • Member
  • 1,220 posts
  • 858
  • Location:UK
  • NO

Posted 08 January 2022 - 09:33 PM

From six weeks ago:

Intel CEO Pat Gelsinger promised at the company's online Innovation event that "Moore's law is alive and well," adding that “we are predicting that we will maintain or even go faster than Moore's law for the next decade […] we expect to even bend the curve faster than a doubling every two years.”

https://www.tti.com/...k-20211124.html


Well if the CEO of Intel says it's so, then it must be so!


#273 bluemoon

  • Guest
  • 761 posts
  • 94
  • Location:south side
  • NO

Posted 09 January 2022 - 04:38 PM

Well if the CEO of Intel says it's so, then it must be so!

 

Intel's CEO also said, back in 2009, that he was confident Moore's Law would continue at least until 2030.

There has been no slowdown from 2010 to 2021.

 

Where is the slowdown in the first week of 2022?



#274 Dream Big

  • Guest
  • 59 posts
  • 89
  • Location:Canada

Posted 18 January 2022 - 01:29 AM

I don't see signs that an AI capable of symbolic manipulation and reasoning is "near", whatever that means. Besides, should we not be worried about the sanity of such a thought-generating machine? And then there's the issue of consciousness - how powerful can a non-conscious AI be?

 

While Microsoft's NUWA and OpenAI's DALL-E might seem to do nothing more than predict the rest of an image or video, which makes them useful editing tools, they are going to be more than that. All human brains do is predict: we predict the rest of an image, a video, a sound, and so on, and we predict what our goal should be and what type of data to collect. What NUWA lacks right now is goals and reasoning ability. Once it can decide what its job should be and figure out complex trick questions, like what the first and last letters of this post I just wrote are, it will be more like us.



#275 Dream Big

  • Guest
  • 59 posts
  • 89
  • Location:Canada

Posted 18 January 2022 - 03:51 AM

Also don't forget: hardware improvements may slow down for now, but the new Moore's law is software, making AI run faster. Many people can now clone AI algorithms and work on them at home, quickly trying to optimize them to be faster or more RAM-friendly. With hardware speedups you would need many people doing physical labor, slowly building the same computer in their "own way" to try to improve on the existing one, and none of us can do that.



#276 QuestforLife

  • Member
  • 1,220 posts
  • 858
  • Location:UK
  • NO

Posted 20 January 2022 - 09:35 AM

Also don't forget: hardware improvements may slow down for now, but the new Moore's law is software, making AI run faster. Many people can now clone AI algorithms and work on them at home, quickly trying to optimize them to be faster or more RAM-friendly. With hardware speedups you would need many people doing physical labor, slowly building the same computer in their "own way" to try to improve on the existing one, and none of us can do that.

I had an Amiga 500 in the 90s and the hardware was almost completely static (though I did upgrade the RAM from 512 KB to 1 MB!). Yet the games improved immeasurably in that time, due to better programming, i.e. better use of the existing hardware. Of course they eventually came up against the limitations of the hardware (it had no hard drive) and were forced to use something like 10 floppy disks per game :)

 

So saying software is the new Moore's law is an acknowledgement that hardware speed improvements have slowed down (or stopped). It is clear to me, looking at the PCs I have owned from 2000 to now, that hardware improvements have slowed down. The industry has to some extent found ways around that: using more cores when clock speeds got stuck, doing more work on the GPU rather than the CPU, faster RAM and, of course, SSDs. But at the same time the programming burden of operating systems like Windows has grown, to the extent that my last couple of computers have hardly got any faster.
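The limit of the "more cores" workaround can be sketched with Amdahl's law: if only part of a workload parallelizes, extra cores give rapidly diminishing returns. A minimal sketch (the 90% parallel fraction is an illustrative assumption, not a measured figure):

```python
# Amdahl's law: overall speedup from n cores when only a fraction p
# of the work can be parallelized; the serial remainder (1 - p) caps it.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% of the work parallel, 64 cores give under a 9x speedup.
for n in (2, 4, 8, 64):
    print(n, round(speedup(0.9, n), 2))
```

As n grows without bound the speedup approaches 1 / (1 - p), which is one reason the industry also moved work onto GPUs rather than just adding CPU cores.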

 

Now there is an argument that we might already have (perhaps in supercomputers) enough operations per second to at least emulate what the human brain can do (say, with much faster processing in each CPU core than in a neuron, but with far fewer cores than humans have neural connections). But estimates of the number of operations the human brain performs per second vary wildly, so I can't be sure.
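For a sense of why those estimates vary so much, here is one common back-of-envelope calculation; every figure in it is a rough, disputed assumption, and changing the assumed firing rate alone moves the answer by orders of magnitude:

```python
# Crude brain throughput estimate: synaptic events per second.
neurons = 8.6e10           # ~86 billion neurons (commonly cited)
synapses_per_neuron = 1e4  # ~10,000 connections each (rough)
firing_rate_hz = 1.0       # ~1 spike/s average (estimates span 0.1-100 Hz)

ops_per_second = neurons * synapses_per_neuron * firing_rate_hz
print(f"{ops_per_second:.1e} synaptic events/s")  # ~8.6e14, i.e. ~1 petaop/s
```

Whether one synaptic event corresponds to one "operation" is itself contested, which is a further source of the wild variation.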

 

My guess is that, using the right shortcuts (i.e. heuristics), we could make an intelligence that captures at least superficially what we are like. But finding those heuristics could be very tricky; after all, evolution has done that for us by trial and error over a long, long time.

 

In any case, I don't think we need to develop a human-like intelligence to solve aging (the main discussion point on this forum). We already have humans for that. Raw computing power can help humans solve it, but if aging is solved it will be humans, not AI, that do it.



#277 Mind

  • Life Member, Moderator, Secretary
  • 17,519 posts
  • 2,000
  • Location:Wausau, WI

Posted 21 January 2022 - 07:13 PM

I had an Amiga 500 in the 90s and the hardware was almost completely static (though I did upgrade the RAM from 512 KB to 1 MB!). Yet the games improved immeasurably in that time, due to better programming, i.e. better use of the existing hardware. Of course they eventually came up against the limitations of the hardware (it had no hard drive) and were forced to use something like 10 floppy disks per game :)

 

So saying software is the new Moore's law is an acknowledgement that hardware speed improvements have slowed down (or stopped). It is clear to me, looking at the PCs I have owned from 2000 to now, that hardware improvements have slowed down. The industry has to some extent found ways around that: using more cores when clock speeds got stuck, doing more work on the GPU rather than the CPU, faster RAM and, of course, SSDs. But at the same time the programming burden of operating systems like Windows has grown, to the extent that my last couple of computers have hardly got any faster.

 

Now there is an argument that we might already have (perhaps in supercomputers) enough operations per second to at least emulate what the human brain can do (say, with much faster processing in each CPU core than in a neuron, but with far fewer cores than humans have neural connections). But estimates of the number of operations the human brain performs per second vary wildly, so I can't be sure.

 

My guess is that, using the right shortcuts (i.e. heuristics), we could make an intelligence that captures at least superficially what we are like. But finding those heuristics could be very tricky; after all, evolution has done that for us by trial and error over a long, long time.

 

In any case, I don't think we need to develop a human-like intelligence to solve aging (the main discussion point on this forum). We already have humans for that. Raw computing power can help humans solve it, but if aging is solved it will be humans, not AI, that do it.

 

This reminds me of the "slow down" in hardware improvements. I remember when clock speeds were the "trendy" measure of progress. Overclocking was a fun thing to try. Clock speeds hit a hard wall at least 15 years ago, it seems. No one talks about them anymore.



#278 Mind

  • Life Member, Moderator, Secretary
  • 17,519 posts
  • 2,000
  • Location:Wausau, WI

Posted 21 January 2022 - 07:16 PM

Here is a neat visual of the current supercomputers in the world. Funny how it trivializes the computing power of the top graphics card in the world. That graphics card does more than a supercomputer from just a few years ago... lol.

 

https://www.visualca...supercomputers/



#279 bluemoon

  • Guest
  • 761 posts
  • 94
  • Location:south side
  • NO

Posted 23 January 2022 - 08:11 AM

This reminds me of the "slow down" in hardware improvements. I remember when clock speeds were the "trendy" measure of progress. Overclocking was a fun thing to try. Clock speeds hit a hard wall at least 15 years ago, it seems. No one talks about them anymore.

 

I don't get the slowdown talk. Here is a graph of Moore's Law on Wikipedia:

 

https://en.wikipedia...t_1970-2020.png

 

Kurzweil has used Hans Moravec's graph of millions of instructions per second (MIPS) per $1,500 in 1995 dollars, and that line has also kept increasing up and to the right, although I haven't seen an update since before the pandemic.
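As a rough sanity check on that graph, compounding the Intel 4004's transistor count forward at one doubling every two years lands in the right ballpark for modern chips (the starting count and dates are approximate):

```python
# Moore's law sanity check: double the 4004's transistor count every 2 years.
count_1971 = 2300                    # Intel 4004, 1971 (~2,300 transistors)
years = 2021 - 1971
projected = count_1971 * 2 ** (years / 2)
print(f"{projected:.1e}")  # ~7.7e10 -- tens of billions, like today's largest chips
```

Fifty years at that rate is 25 doublings, so whether the curve has "slowed" depends heavily on which endpoint chips you pick for the comparison.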



#280 QuestforLife

  • Member
  • 1,220 posts
  • 858
  • Location:UK
  • NO

Posted Yesterday, 10:25 AM

I don't get the slowdown talk. Here is a graph of Moore's Law on Wikipedia:

 
 
 
The talk of a slowdown comes from the experience of buying a new computer every 3 or 4 years over decades. From that perspective advances have slowed, although admittedly prices have also fallen, so you could argue speed per dollar has continued to improve. But even there, I think the cost savings have also stalled. This might have something to do with the exponential rise in chip fabrication costs as components get smaller and smaller (see Moore's second law).

Moore's (first) law has largely relied on the reduction in component size, but only Taiwan and South Korea currently have the capability to manufacture at 5 nm. That is how hard it has become. This is where the next big innovation is needed.


Edited by QuestforLife, Yesterday, 10:26 AM.



#281 bluemoon

  • Guest
  • 761 posts
  • 94
  • Location:south side
  • NO

Posted Yesterday, 08:31 PM

 
 
 
The talk of a slowdown comes from the experience of buying a new computer every 3 or 4 years over decades. From that perspective advances have slowed,

 

There certainly wasn't any slowdown during the Amiga period, 1985 to 1994. There may be a slowdown with respect to the original definition of Moore's Law, although I'd like to see a graph that shows this. As far as I can see, there has not been a slowdown in Kurzweil's Law of Accelerating Returns.

 

Here is Moore's Law again:

[image: Moore's Law graph of transistor counts]







