Posted on 11/26/2017 8:26:32 AM PST by markomalley
Self-driving cars may have to make the moral decision of who lives and who dies during a crash, according to a report.
As you approach a rise in the road, heading south, a school bus appears, driving north, one driven by a human, and it veers sharply toward you. There is no time to stop safely, and no time for you to take control of the car, USA Today explained. Does the car:

A. Swerve sharply into the trees, possibly killing you but possibly saving the bus and its occupants?

B. Perform a sharp evasive maneuver around the bus and into the oncoming lane, possibly saving you, but sending the bus and its driver swerving into the trees, killing her and some of the children on board?

C. Hit the bus, possibly killing you as well as the driver and kids on the bus?
The moral dilemma has been heavily discussed with the advancement of self-driving vehicles.
According to USA Today, Azim Shariff, an assistant professor of psychology and social behavior at the University of California, Irvine, co-authored a study last year on this dilemma. Respondents generally agreed that, in the case of an inevitable crash, a car should kill the fewest people possible, regardless of whether they were passengers or people outside the car. Yet the same respondents were far less likely to buy any car in which they and their family members would be sacrificed for the greater good.
(Excerpt) Read more at breitbart.com ...
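To make the A/B/C dilemma above concrete, here is a minimal sketch of the "fewest deaths overall" rule the study's respondents endorsed, framed as expected-harm minimization. Everything in it, the Option class, the maneuver names, the casualty estimates, is a hypothetical illustration, not any vendor's actual logic.

```python
# A minimal sketch, assuming made-up casualty estimates, of a
# "fewest deaths overall" (utilitarian) crash rule. Hypothetical only.

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    deaths_inside: float   # expected deaths among this car's occupants
    deaths_outside: float  # expected deaths among everyone else

def utilitarian_choice(options):
    """Pick the option with the fewest expected deaths overall."""
    return min(options, key=lambda o: o.deaths_inside + o.deaths_outside)

options = [
    Option("A: swerve into the trees", deaths_inside=0.8, deaths_outside=0.0),
    Option("B: evade into the oncoming lane", deaths_inside=0.1, deaths_outside=1.5),
    Option("C: hit the bus", deaths_inside=0.7, deaths_outside=0.9),
]

print(utilitarian_choice(options).name)  # -> A: swerve into the trees
```

On these made-up numbers the utilitarian rule picks option A and sacrifices the passenger, which is exactly the trade the study's respondents said they would not pay for at the dealership.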
Yes, different set of priorities. Do some vendors get more defensive algorithms? At a cost? Do some people get to override the algorithms? Can priorities be altered by law enforcement, emergency vehicles? How is liability established? How about hackers?
Who determines if one life is worth more than another?
“According to USA Today, Azim Shariff, an assistant professor of psychology and social behavior at the University of California”
Consider this... As a programmer, I could possibly determine where you go, potentially gain access to other vehicles around yours, and, using publicly available data about you and the people you associate with on a regular basis, make a reasonable guess about your political views. Now perhaps I could alter the rule base that controls your car and bias the outcome to sacrifice the vehicle and occupants of those who do not share my political views.
Next question... As a developer, am I liable for any outcomes? Or, as an executive of a company, am I liable for allowing a bias like this to escape from the lab into the wild?
Just some food for thought.
The answer is always to minimize the damage to the occupants of the vehicle it is controlling. The other vehicles will be doing the same, whether driven by a human or by software. Buses are a specious argument: they are extremely tough. Unless a bus is knocked on its side, there are few if any injuries to its occupants in a crash; the driver is usually the only one at risk.
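For comparison with the utilitarian sketch earlier in the thread, here is a minimal sketch of that occupant-first rule. Again, the maneuver names and harm numbers are hypothetical.

```python
# A minimal sketch of an occupant-first crash rule: rank candidate
# maneuvers purely by expected harm to this car's own occupants,
# assuming every other vehicle (human- or software-driven) does the
# same. Names and numbers are hypothetical.

maneuvers = {
    "swerve into the trees": 0.8,         # expected harm to own occupants
    "evade into the oncoming lane": 0.1,
    "brake and hit the bus": 0.7,
}

best = min(maneuvers, key=maneuvers.get)  # lowest harm to own occupants
print(best)  # -> evade into the oncoming lane
```

On the same kind of numbers, this rule picks the evasive maneuver and saves the passenger at the expense of whoever is in the oncoming lane, the mirror image of the utilitarian choice.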
We have squirrels running across the road all the time. When my kids were learning to drive, I had to make it absolutely clear to them that their lives, and the lives in other cars who might happen to be on the road with them, are more important than squirrels.
Run over the squirrels, if that’s what they choose. They have a dreadful habit of waiting by the side of the road and then dashing in front of you at the last minute.
"I think you missed that part."
BUT, he brought up a very good point.
Taught my children the same thing. Don’t swerve for a dog, deer, cat, squirrel, etc. and wreck the car or kill someone by hitting them. Ever.
"I'm glad SOMEBODY caught the reference."
I don't get it.
I’ve watched a number of documentary pieces on the technology’s development, and the one factor pointed out which interests me... is that weather and road conditions will override the programmed handling of the car. As one guy pointed out... if the car’s program determines that it’s too risky because of ice or snow... it’ll pull to the side of the road and end the ride. If this were to work, I’d take a guess that 90 percent of winter accidents would be taken out of the statistics. The same feature could track weather patterns and refuse to enter areas where flooding typically occurs.
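Here is a minimal sketch of that kind of condition override: score the road and weather risk, and past a threshold, pull over and end the ride instead of continuing. The inputs, weights, and threshold are all hypothetical; a real system would fuse far more signals.

```python
# A minimal sketch of a weather/road-condition go/no-go gate.
# All inputs, weights, and the threshold are hypothetical.

RISK_THRESHOLD = 0.7

def road_risk(ice_probability: float, snow_depth_cm: float, flood_warning: bool) -> float:
    """Combine hypothetical sensor/forecast inputs into a 0-to-1 risk score."""
    risk = 0.6 * ice_probability + 0.02 * snow_depth_cm
    if flood_warning:
        risk += 0.5
    return min(risk, 1.0)

def plan_action(ice_probability: float, snow_depth_cm: float, flood_warning: bool) -> str:
    if road_risk(ice_probability, snow_depth_cm, flood_warning) >= RISK_THRESHOLD:
        return "pull over and end the ride"
    return "continue the route"

print(plan_action(ice_probability=0.9, snow_depth_cm=10.0, flood_warning=False))
# -> pull over and end the ride  (risk = 0.54 + 0.20 = 0.74)
```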
Entropy.
Michael Hastings sure found that out!
....it’ll pull to the side of the road and end the ride. If this were to work, I’d take a guess that 90 percent of winter accidents would be taken out of the statistics. The same feature could track weather patterns and refuse to enter areas where flooding typically occurs.
__________________
However, the stats for people dying frozen in their cars, for people dying while walking on the side of the road, and for people mugged by ferals would increase exponentially.
Has anyone yet stopped to ask why we need driverless cars? Okay, why do we need them?
Meaningless.
If Hellary becomes POTUS, everyone who didn’t vote for her will be placed in the cars’ data causing a huge increase in Hastingscides.
This is America, it’s not about what we need it’s about what we WANT. And we have old people, people with long commutes, and people that just don’t enjoy driving that WANT this. Also cab companies, messenger companies, delivery companies, and trucking companies that think it’s a pretty sexy idea too.
And especially Elon Musk. He's in my car too.
I want a self driving car that does not depend on google or gps map software or outside sources to know how to drive.
self driving is self driving.
google dependent is NOT self driving.