Posted on 03/21/2018 12:48:17 PM PDT by SeekAndFind
When a driverless car kills someone, who's to blame?
That's no longer a hypothetical question. A self-driving car operated by Uber struck and killed a woman on a street in Tempe, Arizona, on Sunday night, likely marking a grim milestone for the nascent technology: the first pedestrian killed by such a car on public roads.
Police say the 49-year-old woman was walking a bike across the street, outside the crosswalk, at around 10 p.m. The Uber was traveling at 40 miles per hour in autonomous mode, with an operator in the driver's seat, when she was hit. Police have not yet determined who was at fault. (The car apparently didn't slow down, and the operator didn't appear impaired.) Nonetheless, Uber immediately suspended its self-driving tests in Arizona and nationwide, as many in the tech industry reacted with alarm.
There's an ongoing debate about legal liability when it comes to collisions in which an autonomous vehicle harms someone else through no fault of that person. Would the blame lie with the self-driving car's owner, manufacturer, a combination of the two, or someone else? In their quest to become the Mecca of self-driving cars, Arizona regulators have largely left those questions unanswered, The New York Times reported last year:
(Excerpt) Read more at newrepublic.com ...
Depending on what happened, it might be all her fault.
They picked a real winner to be the safety pilot. That's why this stuff is never going to work. Our civilization back in the '50s could have done it. With the flophouse we have now? Ummmmm ... no.
I’m surprised car manufacturers are coming out with all these autonomous features. These are just lawsuits waiting to happen. Our Toyota has a lane departure assist feature, but it works pretty poorly. If someone depended on that, they would be in trouble. Some of the other features seem to work pretty well.
A Self-Driving Uber Killed a Woman. Whose Fault Is It?
___________________________________________
It’s the woman’s fault. Next question.
In case anyone was wondering, THIS is one of the real hurdles in the widespread adoption of self-driving cars. The technology is the easy part. The legal issues are a nightmare.
An interesting point; it probably depends on the state/local laws where it happened.
"If a pedestrian is hit by a car, the driver of the car that hit the pedestrian is usually (but not always) considered to be at fault, even if the pedestrian was not in a crosswalk.
The reason for this is that most states negligence and traffic laws require drivers to be alert to what is around them and to pay attention to hazards in the road. A pedestrian certainly qualifies as a hazard in the road. In other words, drivers have a legal obligation to see and avoid what is there to be seen."
https://www.nolo.com/legal-encyclopedia/pedestrian-hit-by-car-legal-options.html
Did the car have time to brake and avoid hitting the victim?
A woman steps from the shadows in front of a car. Whose fault is it?
Depends a lot on how close she was, light conditions, toxicology results, and other factors, doesn't it?
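For a rough sense of the distances involved, here is a back-of-the-envelope sketch in Python. The reaction time and deceleration figures are assumed ballpark values for illustration, not facts from the incident:

```python
# Rough stopping-distance estimate at 40 mph.
# Assumed values (illustrative only): 1.5 s perception/reaction time,
# 7.0 m/s^2 braking deceleration (hard braking on dry pavement).

MPH_TO_MS = 0.44704          # miles per hour -> meters per second

speed_ms = 40 * MPH_TO_MS    # ~17.9 m/s
reaction_time_s = 1.5        # assumed human-like reaction time
decel_ms2 = 7.0              # assumed braking deceleration

reaction_dist = speed_ms * reaction_time_s        # distance covered before braking starts
braking_dist = speed_ms ** 2 / (2 * decel_ms2)    # v^2 / (2a)
total = reaction_dist + braking_dist

print(f"Reaction distance: {reaction_dist:.1f} m")               # ~26.8 m
print(f"Braking distance:  {braking_dist:.1f} m")                # ~22.9 m
print(f"Total:             {total:.1f} m (~{total * 3.281:.0f} ft)")
```

Under those assumptions the car needs roughly 50 meters (about 160 feet) to stop from 40 mph, which is why how far away the pedestrian was when she became visible matters so much.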
Autonomous vehicles have multiple cameras and sensors that are all recorded. They will provide us with much more information than we usually have.
Now we know where the bike came from. She was not riding it, but was walking it.
We will get a lot more answers shortly.
I'm sure there is a huge exception to that standard when it comes to pedestrians crossing streets at night -- especially outside crosswalks and in areas where lighting is poor.
The gun, the NRA, FedEx, and the package.
I would think the owner of the self-driving car is liable for any damages just as if the person was driving the car. That person’s machine killed someone. Then that owner’s insurance company would settle with the manufacturer of the self-driving car.
If governments were to mandate self-driving cars, it could turn out that we ALL pay for the damages.
They are so safe, they say. So make the manufacturer criminally liable. Suddenly they will stop making them.
No lawyer will become poor due to the autonomous vehicle craze.
Too many women out there don't have a clue what's going on around them. Us men have to protect them from themselves. That's our first job.
"The reason for this is that most states negligence and traffic laws require drivers to be alert to what is around them and to pay attention to hazards in the road. A pedestrian certainly qualifies as a hazard in the road. In other words, drivers have a legal obligation to see and avoid what is there to be seen."
That's what I always thought. When I'm driving, I'm always looking for hazards, especially errant pedestrians, bike riders, baby carriages, and balls spilling out into the street with children to follow.
I was taught to give pedestrians a very wide berth and to slow down when approaching. Neither of these things happened in this case. I think the programming has a long way to go in this area. I wonder if the people programming these cars are drivers themselves or just programmers?
Has it been ... doctored?
“The technology is the easy part.”
I wouldn’t go that far :-). It’s not that easy. I wish it were, but it isn’t :-). For these systems to be practical, lots of custom hardware needs to be designed ... loads of ASICs and FPGAs ... plus the AI driving these cars is still in its infancy.
Now, is it easy compared to the torture of watching a bunch of lawyers who barely know what they're talking about fight one another in court? Yeah, I'll buy that :-) ... that makes designing this stuff a breeze in comparison :-).