Robot with “morals” makes surprisingly deadly decisions
yahoo.com ^ | 7 hours ago | Rob Waugh

Posted on 09/18/2014 9:43:29 AM PDT by BenLurkin

The so-called ‘Ethical robot’, also known as the Asimov robot after the science fiction writer whose work inspired the film ‘I, Robot’, was meant to save other robots, acting the part of humans, from falling into a hole: sometimes it did, but often it stood by and let them trundle into the danger zone.

The experiment used robots programmed to be ‘aware’ of their surroundings, along with a separate program that instructed the robot to save lives where possible.

Despite having the time to save one of the two ‘humans’ from the ‘hole’, the robot failed to do so more than half of the time. In the final experiment, the robot saved the ‘people’ only 16 times out of 33.

The robot’s programming mirrored science fiction writer Isaac Asimov’s First Law of Robotics, ‘A robot may not injure a human being or, through inaction, allow a human being to come to harm.’

The robot was programmed to save humans wherever possible, and all was fine, says roboticist Alan Winfield, at least to begin with.

(Excerpt) Read more at uk.news.yahoo.com ...


TOPICS: Weird Stuff
Misleading headline -- but interesting article nonetheless
1 posted on 09/18/2014 9:43:29 AM PDT by BenLurkin

To: BenLurkin

Robots don’t have morals. “Morals” is a human concept. A computer program can SIMULATE some aspects of human behavior, and humans are often so stupid that they think that the program actually exhibits human behavior.


2 posted on 09/18/2014 9:45:29 AM PDT by I want the USA back (Media: completely irresponsible. Complicit in the destruction of this country.)

To: BenLurkin

Possibly just a very perceptive robot who could spot morons from a mile away.


3 posted on 09/18/2014 9:45:50 AM PDT by cripplecreek ("Moderates" are lying manipulative bottom feeding scum.)

To: BenLurkin

Maybe they were using democrats, and the robots didn’t recognize them as human...


4 posted on 09/18/2014 9:48:24 AM PDT by Mr. K (Palin/Cruz 2016)

To: Mr. K

The robots were trying to improve the gene pool.


5 posted on 09/18/2014 9:48:57 AM PDT by glorgau

To: BenLurkin

Looks like they just didn’t program them to finish the task. They successfully programmed dithering, I suppose that’s an accomplishment.


6 posted on 09/18/2014 9:49:31 AM PDT by discostu (We don't leave the ladies crying cause the story's sad.)

To: discostu

Dither-bots

Sounds like they put them into no-win situations where they couldn’t choose one person over another.


7 posted on 09/18/2014 9:52:51 AM PDT by cripplecreek ("Moderates" are lying manipulative bottom feeding scum.)

To: BenLurkin

Simple programming failure: the programmer didn’t apply a process for prioritizing, and the robot, not having clear instructions, kept looping from person A to person B.


8 posted on 09/18/2014 9:53:54 AM PDT by camle (keep an open mind and someone will fill it full of something for you)

To: BenLurkin; GraceG; Zionist Conspirator; CtBigPat; Norm Lenhart; TADSLOS

9 posted on 09/18/2014 9:55:22 AM PDT by KC_Lion (Build the America you want to live in at your address, and keep looking up.- Sarah Palin)

To: camle

Indeed.

The early Phalanx missile defense systems on US warships had the same problem with their programming.

Now they are programmed to “flip a coin”, all other things being equal.

Or so I have been told.


10 posted on 09/18/2014 9:56:01 AM PDT by BenLurkin (This is not a statement of fact. It is either opinion or satire; or both.)

To: I want the USA back

Bingo. It feels nothing. It will only kill if programmed to do so, “executing” a set of commands with no awareness of what it is doing. Some people have been watching too much science fiction... :)


11 posted on 09/18/2014 10:00:34 AM PDT by jsanders2001

To: glorgau

12 posted on 09/18/2014 10:00:37 AM PDT by Tijeras_Slim

To: BenLurkin

More like “Programmers Write Bad Code for Ethical Robot”.


13 posted on 09/18/2014 10:01:37 AM PDT by Wolfie

To: I want the USA back

Pretty much. Who in their right mind would trust a machine programmed by humans?? I sure wouldn’t...


14 posted on 09/18/2014 10:03:37 AM PDT by 556x45

To: BenLurkin

I suspect that the ‘dithering’ is the result of the “save humans” program completely re-evaluating the environment and seeing that another ‘human’ was in trouble and that it had enough time to get him, too... then it re-evaluated the environment and saw the first human and that it had enough time to save him...

IOW, the /real/ problem here is a sort of failure to hold on to “partial solutions”. It would be interesting if they added a valuation (the number of savable people) and used a strict greater-than for the comparison function; that way, once it “makes up its mind” to save someone, it continues on that task until/unless it spies a greater number of people it can save.
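
Something along these lines (an illustrative Python sketch; the names and structure are made up, not the experiment's actual code): the planner still re-evaluates the scene every cycle, but it only abandons its current plan for one that saves strictly more people, so a tie never makes it change its mind mid-rescue.

    # Illustrative sketch only -- hypothetical names, not the experiment's code.
    # Re-evaluate every cycle, but only switch plans when the new plan saves
    # strictly MORE people than the one already committed to.
    from typing import List, Optional, Set

    def choose_plan(candidates: List[Set[str]],
                    current: Optional[Set[str]]) -> Optional[Set[str]]:
        """Each candidate is the set of 'humans' a feasible rescue plan saves."""
        best = max(candidates, key=len, default=None)
        if current is None:
            return best
        # Strict greater-than: an equally good alternative never displaces
        # the plan the robot has already made up its mind about.
        if best is not None and len(best) > len(current):
            return best
        return current

    # Toy run: after committing to 'A', a same-sized plan for 'B' is ignored,
    # but a plan that saves both takes over.
    plan = None
    for scene in ([{"A"}, {"B"}],      # tie -> commit to one and stick with it
                  [{"B"}, {"A"}],      # re-evaluation, equal value -> no dithering
                  [{"A", "B"}]):       # strictly better -> switch
        plan = choose_plan(scene, plan)
        print(plan)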


15 posted on 09/18/2014 10:03:44 AM PDT by OneWingedShark (Q: Why am I here? A: To do Justly, to love mercy, and to walk humbly with my God.)

To: BenLurkin

Ping


16 posted on 09/18/2014 10:04:28 AM PDT by Jonty30 (What Islam and secularism have in common is that they are both death cults)

To: BenLurkin

They’re not yet using positronic brains.

That’s the problem.


17 posted on 09/18/2014 10:06:16 AM PDT by Da Coyote

To: Wolfie

It's not a bug, it's a feature.

18 posted on 09/18/2014 10:06:53 AM PDT by Tijeras_Slim

To: camle

Lots of interesting problems here that few robots could solve to humane satisfaction.

If you program the robot to save only the people wearing a blue shirt, it will do it successfully every time, until you shake things up by sending two blue-shirted people into danger.

A human will make a completely different set of assessments and calculations. A human is going to consider saving both and formulate a quick plan. A human will consider risking one to save the other with an eye on going back to save the one he risked. Then there are the situations where a human will try to save the one (like a child) with little hope of survival on purely altruistic grounds.


19 posted on 09/18/2014 10:08:21 AM PDT by cripplecreek ("Moderates" are lying manipulative bottom feeding scum.)

To: Da Coyote

They’re not yet using positronic brains.

That’s the problem.

Democrats have been using Negatronic brains for years...


20 posted on 09/18/2014 10:09:30 AM PDT by tet68 ( " We would not die in that man's company, that fears his fellowship to die with us...." Henry V.)

