  LongeCity
              Advocacy & Research for Unlimited Lifespans





Robots can learn to lie


8 replies to this topic

#1 Shannon Vyff

  • Life Member, Director Lead Moderator
  • 3,897 posts
  • 702
  • Location:Boston, MA

Posted 17 January 2008 - 04:08 AM


http://discovermagaz...earn-how-to-lie

Well, they have a way to go before they could be as good at it as us :) they evolved heroes too though!

Edited by Shannon, 17 January 2008 - 04:11 AM.


#2 Live Forever

  • Guest Recorder
  • 7,475 posts
  • 9
  • Location:Atlanta, GA USA

Posted 17 January 2008 - 05:02 AM

Liar, liar, pants on fire!


#3

  • Lurker
  • 0

Posted 17 January 2008 - 07:27 AM

Hmm. This is interesting, insightful, and highly amusing.

I wonder if there's a place where I can have a look at the code?

#4 Ghostrider

  • Guest
  • 1,996 posts
  • 56
  • Location:USA

Posted 17 January 2008 - 08:12 AM

Hmm. This is interesting, insightful, and highly amusing.

I wonder if there's a place where I can have a look at the code?


My computer lies to me all the time. It's called debugging a program.

#5

  • Lurker
  • 0

Posted 17 January 2008 - 08:42 AM

My computer lies to me all the time. It's called debugging a program.

Huh?

Wouldn't it be more correct to call it a "bugged program", with "debugging a program" being defined as the act of preventing it from lying in the future?

That, and since it was you who wrote the program in the first place and the computer merely executed your orders, wouldn't a bugged program translate into "I am lying to myself due to my limited ability to speak the truth"?

Edited by Hudzon, 17 January 2008 - 08:43 AM.


#6 lucid

  • Guest
  • 1,195 posts
  • 65
  • Location:Austin, Tx

Posted 17 January 2008 - 09:09 AM

http://discovermagaz...earn-how-to-lie

Well, they have a way to go before they could be as good at it as us :) they evolved heroes too though!

Interesting. What determines whether this is interesting or not is how generally the intelligence is coded. This could be done with a rather rigid intelligence system where the robot chooses any one of its ten or so actions based on its inherited 'genetic' cues. Even from a very simple rigid system it would not be hard to get the more complicated results such as 'cheaters' and 'heroes'. Further, if they were able to achieve such things in 50 generations, then it is more likely that the robots were programmed rigidly rather than generally.
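A rigid setup like the one described above is easy to sketch. Below is a toy genetic algorithm (my own illustration, not the study's actual code or payoffs) where each robot's single 'gene' is its probability of lighting up near food; because signalling costs the finder part of its meal, non-signalling 'liars' get selected for within a few dozen generations:

```python
import random

random.seed(0)

POP, GENS = 60, 50

def fitness(signal_prob):
    # Hypothetical payoff: finding food is worth 1.0, but a robot
    # that lights up attracts the crowd and loses half its share.
    return 1.0 - 0.5 * signal_prob

def evolve():
    # Start with honest signallers: always light up near food.
    pop = [1.0] * POP
    for _ in range(GENS):
        ranked = sorted(pop, key=fitness, reverse=True)
        parents = ranked[: POP // 2]  # truncation selection: top half breeds
        pop = [min(1.0, max(0.0, p + random.gauss(0, 0.05)))  # small mutation
               for p in random.choices(parents, k=POP)]
    return sum(pop) / POP

mean_signal = evolve()
print(f"mean signalling rate after {GENS} generations: {mean_signal:.2f}")
```

Even this crude setup drives the signalling rate toward zero, which is roughly the dynamic the article describes, without any general intelligence at all.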

#7 Shannon Vyff

  • Topic Starter
  • Life Member, Director Lead Moderator
  • 3,897 posts
  • 702
  • Location:Boston, MA

Posted 17 January 2008 - 06:54 PM

Hmm. This is interesting, insightful, and highly amusing.

I wonder if there's a place where I can have a look at the code?



Wouldn't hurt to try contacting them ;)

#8 nefastor

  • Guest
  • 304 posts
  • 0
  • Location:France

Posted 19 February 2008 - 02:53 AM

Liar, liar, pants on fire!


I guess the robot version would be "liar, liar, wheels on fire!" :p

Seriously, though, it makes perfect sense that robots could learn to lie: creatures evolving in an environment with limited resources (such as a planet you can't leave) have to compete for those resources. The most advanced tools in this competition involve misdirecting other creatures (away from a food source or a good shelter, for example).

I always laugh when I see books from highly successful people (say, "the Donald") that are supposed to reveal their secrets on how to become rich: in a limited-resource environment it's obvious you can't be rich unless others are poor, so the better you succeed, the less likely you are to tell others how to. It takes very little intelligence (common sense) to reach that conclusion.

From there, you're always going to lie. The means you'll employ are only constrained by whatever ethics (or lack thereof) plague your mind: it goes from willful omission to outright fabrication.

For examples covering the entire "lie spectrum", just look back at everything George W. Bush ever said. He's one for the school books. If he can lie so much (I wouldn't say "so well"), and considering his IQ is lower than his shoe size, how hard can it be? :~

Besides, even if robots don't learn how to lie by themselves, we have a vested interest in teaching them. Suppose you want to market a housekeeping robot: will people buy a machine that repeatedly observes they are dorky slobs, or one that ceaselessly approves of their "lifestyle choices"? The market economy calls for robots that can lie, so engineers will be paid to make sure they can.

I'm betting Microsoft will get there first: Windows already lies by omission. It doesn't tell you ALL the processes running on your PC at any one time, nor why it runs out of page file capacity while showing 10% RAM usage. Talk about skeletons in the closet!

Call me when they make a robot that can evolve ethics and feel bad about telling lies: THAT will be something.

Nefastor


#9 Grail

  • Guest, F@H
  • 252 posts
  • 12
  • Location:Australia

Posted 20 February 2008 - 12:30 AM

I'm not sure I understand their interpretation of the results.

"to alert the others when they’d found food or poison" : From this I gather that they were known to light up on both occasions.

Couldn't the "cheater" robot just be signalling that there was poison by lighting up? The concept of "hero" robots also eludes me. I think it's a rather silly claim to make.
They seem to be simply differences in the code compared to the rest of the population. A random mutation in fact. It's not like theyre saying the robot could "choose" to lie or not. Being deceptive or heroic implies a conscious choice, which these robots obviously didn't have. Absolute conjecture and personification on the part of the researchers who made these claims, or the media is just emphasising what they said in jest.

Or I just don't understand. Or a little of both :D
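The point above, that the "deception" may just be copying noise rather than choice, can be made concrete with a toy sketch (purely illustrative; the genome layout and names are my own assumptions, not the study's code):

```python
import random

random.seed(1)

# A controller reduced to a lookup table: sensed object -> light on/off.
# The "honest" genome lights up for both food and poison, as in the article.
honest = {"food": True, "poison": True}

def mutate(genome, rate=0.1):
    # A blind bit-flip during copying: no intent, no "decision to lie".
    return {k: (not v) if random.random() < rate else v
            for k, v in genome.items()}

child = honest
for _ in range(30):  # thirty generations of noisy copying
    child = mutate(child)

# A child that no longer lights up near food looks like a "cheater",
# yet nothing anywhere in this process chose anything.
print(child)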



