  LongeCity
              Advocacy & Research for Unlimited Lifespans





The Singularity and Nay Sayers


7 replies to this topic

#1 Lazarus Long

  • Life Member, Guardian
  • 8,116 posts
  • 242
  • Location:Northern, Western Hemisphere of Earth, Usually of late, New York

Posted 27 July 2009 - 05:21 PM


This article covers the full range of views, from the optimists to the Chicken Littles.

http://www.nytimes.c.../...tml?_r=1

Scientists Worry Machines May Outsmart Man

By JOHN MARKOFF
Published: July 25, 2009

A robot that can open doors and find electrical outlets to recharge itself. Computer viruses that no one can stop. Predator drones, which, though still controlled remotely by humans, come close to a machine that can kill autonomously.

Impressed and alarmed by advances in artificial intelligence, a group of computer scientists is debating whether there should be limits on research that might lead to loss of human control over computer-based systems that carry a growing share of society’s workload, from waging war to chatting with customers on the phone.

Their concern is that further advances could create profound social disruptions and even have dangerous consequences.

As examples, the scientists pointed to a number of technologies as diverse as experimental medical systems that interact with patients to simulate empathy, and computer worms and viruses that defy extermination and could thus be said to have reached a “cockroach” stage of machine intelligence. (excerpt)



#2 forever freedom

  • Guest
  • 2,362 posts
  • 67

Posted 27 July 2009 - 05:35 PM

I'm not sure, but I have the impression that more and more people are starting to accept or at least consider the possibility of a future scenario like the one Kurzweil paints. Were these ideas this mainstream, say, a decade ago? It could just be that every sort of meme is becoming more widespread because of the internet, though.


#3 Mind

  • Life Member, Director, Moderator, Treasurer
  • 19,146 posts
  • 2,000
  • Location:Wausau, WI

Posted 27 July 2009 - 08:22 PM

I think it is just becoming more obvious to people that machines and software are becoming more capable. Whether you want to call it intelligence is perhaps debatable, but there is no doubt that machines are becoming more pervasive and invasive (interfacing more intimately with the human mind). If the Association for the Advancement of Artificial Intelligence is not talking about it (the singularity concept), then they are irrelevant.

#4 niner

  • Guest
  • 16,276 posts
  • 1,999
  • Location:Philadelphia

Posted 27 July 2009 - 08:48 PM

Great article. I didn't get the sense that they were naysayers on the singularity; rather, they were recognizing that, given the spectrum of technologies from Predator drones to consumer data mining to autonomous vehicles, there are potential areas of concern worth looking at. I thought this was pertinent:

The meeting on artificial intelligence could be pivotal to the future of the field. Paul Berg, who was the organizer of the 1975 Asilomar meeting and received a Nobel Prize for chemistry in 1980, said it was important for scientific communities to engage the public before alarm and opposition becomes unshakable.

“If you wait too long and the sides become entrenched like with G.M.O.,” he said, referring to genetically modified foods, “then it is very difficult. It’s too complex, and people talk right past each other.”


There could be a lesson here for the life extension community...

#5 Reno

  • Guest
  • 584 posts
  • 37
  • Location:Somewhere

Posted 28 July 2009 - 07:18 PM

This will be one of those things the general public will have to see to believe. Sure, there will be groups like this debating it, but those of us who foresee it are in the far, far minority. I think AI right now is where the idea of cell phones was in the '70s. At the moment it's just a nerdy comic-book pipe dream. I think the real problems with AI won't be the predictable ones.



#6 kismet

  • Guest
  • 2,984 posts
  • 424
  • Location:Austria, Vienna

Posted 28 July 2009 - 09:34 PM

There could be a lesson here for the life extension community...

Or for the public at large? (including religious and conservative luddites) Get off your censorship horse and stop suppressing technologies you simply don't understand (well, I can dream, can't I?)

#7 niner

  • Guest
  • 16,276 posts
  • 1,999
  • Location:Philadelphia

Posted 29 July 2009 - 02:57 AM

There could be a lesson here for the life extension community...

Or for the public at large? (including religious and conservative luddites) Get off your censorship horse and stop suppressing technologies you simply don't understand (well, I can dream, can't I?)

Yeah, I know how you feel. They don't understand them, but that doesn't stop them from voting against them. That's why you want to attempt to educate them on the subject, or at least present it in the least scary way, before they do something harmful out of ignorant fear.


#8 modelcadet

  • Guest
  • 443 posts
  • 7

Posted 29 July 2009 - 07:45 AM

I'm still searching for an advisor to assist me as I figure out how to release my own thoughts. I have gone to several people who frankly don't know what I'm talking about at all and only provide empty support. It just stinks that SIAI won't hear my perspective yet, but that's sort of an Eddingtonesque challenge and should help me refine my work (although some funding for both philosophies would be more helpful).

It's definitely too early to be pushing politically, and I think we're making great inroads with Main Street and Wall Street. I just want to note how much of that is Klein's doing, not just Kurzweil's. I'm still trying to figure out how to bridge the divide between transhumanist memes and mainstream academia, though progress is too slow to make a difference at the moment. I really hope I can move to the West Coast after I graduate from UVa, so I may at least be near more transhumanists; I remember Dr. Goertzel commenting on how lonely it can feel in Maryland, and I can attest to similar sentiments in Virginia.

Sorry for the bitter rambling. I'll keep soldiering on, selling blunts and 40s to pay for a worthless degree, still only wishing anybody around me gave enough of a shit to help us save the world.



