Posted on 03/01/2016 9:48:31 AM PST by Faith Presses On
When it comes to robots, humans can be a little too trusting. In a series of experiments at Georgia Tech that simulated a building fire, people ignored the emergency exits and followed instructions from a robot -- even though they'd been told it might be faulty.
The study involved a group of 42 volunteers who were asked to follow a "guidance robot" through an office to a conference room. They weren't told the true nature of the test.
The robot sometimes led participants to the wrong room, where it did a couple of circles before exiting. Sometimes the robot stopped moving and a researcher told the participants it had broken down.
You might expect those problems to have dented any trust people had in the robot, especially in the event of a life-or-death situation. But apparently not.
(snip)
His research began when he became interested in how robots could help humans during emergencies. The study was sponsored in part by the Air Force Office of Scientific Research (AFOSR), which is interested in the same question.
The researchers originally wanted to determine whether people would trust emergency and rescue robots. After seeing these results, a better question to explore might be how to stop people from trusting robots too much.
(Excerpt) Read more at computerworld.com ...
Anyone who has listened to what happens to people who blindly follow GPS directions knew this a long time ago.
No doubt brought to you by the same fine folks that want you to ride in and own an autonomous car.
The test is not accurate, because the participants were merely instructed to "follow the robot". They didn't know *why*. Had they instead been told, "For the purposes of this test, the building is on fire. Follow the robot to safety," the robot would likely have been walking alone pretty quickly after the first wrong turn.
My GPS used to instruct me to get off the freeway at a particular exit, go 500 yards down the road, turn around and get back on the freeway. I ignored it.
Just like Climate Change computer models!!!
Not only scary ... and SAD!
Older people often lose their sense of smell.
VERY IMPORTANT!
Added: The conditions were “follow the robot to a conference room”. So the participants could well have assumed that the test was of a robot that would provide hosting/receptionist duties.
I know in sailing it’s known as ‘GPS assisted grounding’.
I’m sure the robots are happy about these results.
It’ll make it that much easier for them to take over when the time comes.
Bender is up to his old tricks again.
DANGER....WILL ROBINSON!
IBTWWR
In before the Warning Will Robinson
perhaps not, d’uh
In downtown Seattle I have seen, several times, a homeless person walk out into the street against the light, and other people in suits see them and follow them into traffic.
Reminds me of the land navigation course at Fort Benning. They give you an azimuth (direction) and distance. You plot the destination on a map, then go to it. The destination could be a hilltop on the other side of a pond. You can see the hilltop, but if the azimuth crosses a pond or a swamp, some will follow it blindly instead of walking around.
People will follow stupid people too.
Robot lives matter
Right. Most people trust program-controlled machines too much (GPS, robots, computers). Some people (not me) are fascinated and awed by the mere mention of the word "computer" and invest it with untold intelligence and authority.
If the damned robot can't be trusted, then why the hell do you use it to lead people out of a burning building? Eh? Who's stupid now?
If you're going to use it to lead people out of a burning building, then it has got to be reliable -- or you don't use it! Use a dog instead!