Posted on 06/26/2016 10:33:08 PM PDT by Hojczyk
Whether the choice was between their own car fatally crashing to save two, three, or ten pedestrians, "what we found was that the large majority of people strongly feel that the car should sacrifice its passenger for the greater good," says Bonnefon. "Even when people imagined themselves in the car, they still say that the car should sacrifice them for the greater good. And even when people imagine being in a car with a family member or even with their own child, they still said the car should kill them for the greater good."
There is a big "but" coming. When given the option of hypothetically buying a self-driving car that's utilitarian (it saves the greatest number of people) or one that's selfish (programmed to save its passenger at all costs), people are quick to buy the selfish option. When it comes to utilitarian cars, "they tell us that it's great if other people get these cars, but I prefer not to have one myself," says Bonnefon.
Economists call this feeling a social dilemma. It's a bit like how most people view paying taxes. Yeah, everyone should do it. But nobody is too keen on doing it themselves.
(Excerpt) Read more at popularmechanics.com ...
Curious how you might approach these ethical self-driving car scenarios? The scientists published an interactive website today for you to explore them.
The car should be programmed to allow the injury or death of as few people as possible, such as hitting a tree instead of running over a crosswalk full of kids.
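The casualty-minimizing rule described above can be sketched in a few lines. This is purely a hypothetical illustration: the maneuver names and casualty counts are invented, and a real system would be working with uncertain probability estimates, not known numbers.

```python
# Hypothetical sketch of the "fewest casualties" rule from the post above.
# Maneuvers and casualty estimates are invented for illustration only.

def choose_maneuver(options):
    """Pick the candidate maneuver with the fewest predicted casualties."""
    return min(options, key=lambda o: o["predicted_casualties"])

options = [
    {"maneuver": "stay_course", "predicted_casualties": 5},       # crosswalk full of kids
    {"maneuver": "swerve_into_tree", "predicted_casualties": 1},  # the passenger
]

print(choose_maneuver(options)["maneuver"])  # swerve_into_tree
```

Note that this tiny rule is exactly the "utilitarian" programming the article says people endorse for everyone else's car but won't buy for their own.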
What if those “kids” are holding chains, bats, bricks, and guns?
It may come to a point where people will not be allowed to drive. A robot car would never speed or drive drunk; it would obey all traffic laws and would not be distracted by a phone, radio, or screaming kid in the back seat, etc. So there would probably be far fewer accidents or deadly crashes to start with.
I’m missing something. I thought self driving cars were supposed to be able to stop, react more quickly than humans can, and avoid accidents and collisions. I find it hard to believe that the self driving car would find itself in such a dilemma.
But if they do have this dilemma, then these cars still have glitches.
1) better you than me.
2) what does it do when BLM or Mexicans block the street in front of you in a protest and you need to nose through...or even accelerate?
It will stop politely and you will get the Reginald Denny LA riot treatment.
I think self-driving cars, trucks, and ships are DOA once someone uses one to deliver explosives. Someone might not get their 72 virgins, but putting a hundred pounds of explosives into a car, strapping in a warm body, and setting a destination seems too easy for a budding Islamic terrorist to pass up. Maybe they'd be the warm body. However it is done, I think the threat of self-driving cars delivering Islamic terrorist explosives is too much to allow them on the road or on the ocean.
Currently one of the biggest problems for self-driving cars is trying to compensate for what unpredictable and irrational human drivers do around them. They don't follow the rules.
...and a robot car won’t take you to places where the government doesn’t want you to go, won’t work at all if the government doesn’t want you to travel, and will keep a detailed log of everywhere you go, when you went there, and how long you stayed.
Big Brother is here.
The police shut down the roads now when they don't want you to go somewhere. You currently need the government's permission to even drive.
It will also lock you in, or out.
"...A robot car would never speed or drunk drive, itwould obey all traffic laws and would not be distracted by a phone, radio or screaming kid in the backseat, etc..."
What about pedestrians who deliberately run in front of oncoming self-driving cars? Why should the occupants of the car be the ones to die in that scenario? Why shouldn't the car kill the pedestrian instead?
So that’s what happened to Michael Hastings. He was sacrificed so that others could lie ...I mean live...my bad...
I was thinking the government would be controlling the political demographics via vehicular “accidents”.
Planes fly and can even land on autopilot. Rockets land on barges with automation. Elevators run on automation and don’t dash their passengers to death except for an occasional malfunction. Commuter trains run on automation. Automated cars are coming.
Will there be a day where the state will require the control over how it’s programmed? How many ways might that power be abused?
Which is why I won’t have a self-driving car. My car doesn’t decide if I live or die. It can’t weigh my chances, use my instincts, or do anything beyond its programming.
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.