Posted on 05/06/2018 8:14:51 PM PDT by BenLurkin
Self-driving vehicles have been proposed as a solution to the rapidly increasing number of fatal traffic accidents, which now claim a staggering 1.3 million lives each year.
While we have made strides in advancing self-driving technology, we have yet to explore at length how autonomous vehicles will be programmed to deal with situations that endanger human life, according to a new study published in Frontiers in Behavioral Neuroscience.
To understand how self-driving cars might make these judgments, the researchers looked at how humans deal with similar driving dilemmas.
When faced with driving dilemmas, people show a high willingness to sacrifice themselves for others, make decisions based on the victim's age and swerve onto sidewalks to minimize the number of lives lost.
...
Automated vehicles will eventually outperform their human counterparts, but there will still be circumstances where the cars must make an ethical decision to save or possibly risk losing a human life.
...
Bergmann recognized that the majority of people would not approve of the decisions the cars made if they followed the ethics commission's guidelines.
"If autonomous vehicles abide by guidelines dictated by the ethics commission, our experimental evidence suggests that people would not be happy with the decisions their cars make for them."
(Excerpt) Read more at dailymail.co.uk ...
If Honda comes out with a Santeria model, I’m not buying it!
“...and swerve onto sidewalks to minimize the number of lives lost.”
Or maximize, depending on one’s ‘religion’.
Coming soon a new Toyota model: The Kamikaze!
Are self-driving cars able to drive in bad weather, especially snow and ice? Do they have the capability to perceive changes in road and weather conditions ahead?
How do they know to stop for traffic lights and stop signs? What if there is no stop sign, but the word "STOP" is just painted on the road?
What if the car is supposed to stop for road workers who are flagging and directing traffic?
I’m sure the engineers and programmers have thought of these things, but will self-driving cars really react better than humans to unusual situations?
It will be a big seller in the Mid East.
Isaac Asimov’s “Three Laws of Robotics”
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
1. A robot will not harm authorized Government personnel but will terminate intruders with extreme prejudice.
2. A robot will obey the orders of authorized personnel except where such orders conflict with the Third Law.
3. A robot will guard its own existence with lethal antipersonnel weaponry, because a robot is bloody expensive.
The basic philosophical problem is that a self driving car only obeys the interests of its programmers, not its owner. The practical problem is even worse; self driving cars can and will be taken over at any time by government or even more sinister forces.
When self driving cars are mandated, liberty ends.
NO.
Nobody in their right mind would buy such a car.
Only if they are liberals.
Isaac Asimov’s “Three Laws of Robotics”
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Yeah, just ask Michael Hastings.
The Dreaded Blue Screen of Death
That's why there is a push by the "global elites" to make them mandatory, just like with electric cars.
“Nobody in their right mind would buy such a car.”
Exactly, because all I have to do to murder you is to step in front of your car.
Regarding the annual number of fatalities: deport all illegals, and driving fatalities and injuries will decrease some. Make other choices once that is done.
I also think self-driving cars are not safe and likely more of a hazard. I’ve twice witnessed them braking unnecessarily and almost causing accidents. Whether it was the car’s fault or the person riding inside actually did something, either way it was a problem. I try to be nowhere near them when I see them on the road.
I thought part of the idea of self-driving cars is that they could all network with one another to make coordinated decisions about avoiding accidents, reacting in ways uncoordinated human drivers cannot.
No, I’m not big on the self-driving car idea. Someone could hack a car or hack Vehicle Traffic Control to create mischief with tragic consequences.
I’m still curious as to how they navigate tunnels and parking garages.
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.