Posted on 09/18/2014 9:43:29 AM PDT by BenLurkin
The so-called Ethical robot, also known as the Asimov robot after the science fiction writer whose work inspired the film I, Robot, saved robots acting the part of humans from falling into a hole, but it often stood by and let them trundle into the danger zone.
The experiment used robots programmed to be aware of their surroundings, and with a separate program which instructed the robot to save lives where possible.
Despite having the time to save one of the two 'humans' from the hole, the robot failed to do so more than half of the time. In the final experiment, the robot managed a save in only 16 out of 33 trials.
The robot's programming mirrored science fiction writer Isaac Asimov's First Law of Robotics: "A robot may not injure a human being or, through inaction, allow a human being to come to harm."
The robot was programmed to save humans wherever possible, and all was fine, says roboticist Alan Winfield, at least to begin with.
(Excerpt) Read more at uk.news.yahoo.com ...
Robots don’t have morals. “Morals” is a human concept. A computer program can SIMULATE some aspects of human behavior, and humans are often so stupid that they think that the program actually exhibits human behavior.
Possibly just a very perceptive robot who could spot morons from a mile away.
Maybe they were using democrats, and the robots didn’t recognize them as human...
The robots were trying to improve the gene pool.
Looks like they just didn't program them to finish the task. They successfully programmed dithering; I suppose that's an accomplishment.
Dither-bots
Sounds like they put them into no-win situations where they couldn't choose one person over another.
Simple programming failure - the programmer didn't apply a process for prioritizing, and the robot, not having clear instructions, kept looping from person A to person B.
Indeed.
The early Phalanx missile defense systems on US warships had the same problem with their programming.
Now they are programmed to “flip a coin”, all other things being equal.
Or so I have been told.
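For the curious, here is a minimal Python sketch of that "flip a coin" tie-break. Everything in it (pick_target, score) is a made-up illustration of the idea, not anything from the article or from the actual Phalanx system:

    import random

    def pick_target(targets, score):
        # Choose the highest-priority target; break exact ties at random.
        # Without the coin flip, two equally urgent targets can make a
        # selector flip-flop between them on every re-evaluation.
        best = max(score(t) for t in targets)
        tied = [t for t in targets if score(t) == best]
        return random.choice(tied)  # "flip a coin", all other things being equal

    # Two equally urgent targets: the coin flip commits the selector to one.
    print(pick_target(["person_a", "person_b"], lambda t: 1.0))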
Bingo. It feels nothing. It will only kill if programmed to do so, "executing" a set of commands with no awareness of what it is doing. Some people have been watching too much science fiction...: )
More like “Programmers Write Bad Code for Ethical Robot”.
Pretty much. Who in their right mind would trust a machine programmed by humans?? I sure wouldn't...
I suspect that the 'dithering' is the result of the "save humans" program completely re-evaluating the environment and seeing that another 'human' was in trouble and that it had enough time to get him, too... then it re-evaluated the environment and saw the first human and that it had enough time to save him...
IOW, the /real/ problem here is a sort of failure to hold on to "partial solutions". It would be interesting if they added in a valuation (number of savable people) and used a strict greater-than for the comparative function; that way, once it "makes up its mind" to save someone, it continues on that task until/unless it spies a greater number of people it can save.
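A rough Python sketch of that strict greater-than idea, purely to illustrate the suggestion above; the Rescuer class and its names are hypothetical, not anything from Winfield's actual controller:

    class Rescuer:
        # Hold on to the current rescue plan unless a re-scan finds one
        # that saves strictly more people (comparison is >, not >=).
        def __init__(self):
            self.plan = None  # (people_savable, target)

        def reevaluate(self, options):
            # options: list of (people_savable, target) from one re-scan
            best = max(options, key=lambda o: o[0], default=None)
            if best is not None and (self.plan is None or best[0] > self.plan[0]):
                self.plan = best  # equal-valued alternatives never displace it
            return self.plan

    r = Rescuer()
    r.reevaluate([(1, "person_a")])                   # commit to person_a
    r.reevaluate([(1, "person_a"), (1, "person_b")])  # tie: keep person_a, no dither
    r.reevaluate([(2, "both")])                       # strictly better: switch

Because the comparison is strict, a tied alternative can never displace the plan in progress, which is exactly the commitment the post describes.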
Ping
They’re not yet using positronic brains.
That’s the problem.
It's not a bug, it's a feature.
Lots of interesting problems here that few robots could solve to humane satisfaction.
If you program the robot to save only the people wearing a blue shirt, it will do it successfully every time, until you shake things up by sending two blue-shirted people into danger.
A human will make a completely different set of assessments and calculations. A human is going to consider saving both and formulate a quick plan. A human will consider risking one to save the other with an eye on going back to save the one he risked. Then there are the situations where a human will try to save the one (like a child) with little hope of survival on purely altruistic grounds.
They're not yet using positronic brains.
That's the problem.
Democrats have been using Negatronic brains for years...