Posted on 05/24/2018 9:05:03 PM PDT by Moonman62
Blame the victim, got it. Say no more.
It’s the law that’s blaming the pedestrian, not me.
Your last sentence is the pièce de résistance. They thought they were saving money by doing testing in the street instead of in test areas with crash dummies.
Once again the evidence is in front of your face.
From the NTSB report above:
In this area, northbound Mill Avenue is separated from southbound Mill Avenue by a center median containing trees, shrubs, and brick landscaping in the shape of an X. Four signs at the edges of the brick median, facing toward the roadway, warn pedestrians to use the crosswalk. The nearest crosswalk is at the intersection of Mill Avenue and Curry Road, about 360 feet north of where the crash occurred.
It is certainly arguable that no human driver could have prevented that fatality. But that is irrelevant. If a human driver were texting, and in fact the assigned driver was equally distracted, he or she would have been assigned some percentage of the blame. Would it have been more than 50%? Probably. In any case, Uber and its programmers must share the blame for the death.
No, it’s not the law. In all such cases the drivers and pedestrians are assigned percentages of blame. Your suggestion is that the pedestrian is 100% at fault here. That is not true. The reason it is not true is that the car made no effort to evade or stop.
...
I think you’re shifting to civil law.
As far as traffic law in AZ is concerned, it was her responsibility to yield to the car in that area.
Any system that supposedly does most of the job of driving, but requires a human to intervene at a (very short) moment’s notice to avoid disaster, is not acceptable as far as I’m concerned. Watching over a self-driving car closely, ready to take control at any moment, sounds more tiring than just driving the car yourself. That goes for most of the semi-autonomous driving aids already in use, like Tesla’s “Autopilot”, for example, which should probably be banned for the false sense of security they give their drivers.
Never mind edge cases, most of these things still have trouble determining what to do if the lane markings are not perfectly clear. That is to say, they’re not even capable of doing what the average (which is to say pretty poor) driver can do with very little thought.
I’m not willing to put my life, or others, in the hands of an automated driving system that has zero common sense.
Certainly applies to civil law. Criminal negligence by Uber will be harder to prove but is still plausible and should be pursued.
No doubt. Slam dunk they’d win. No braking whatsoever prior to impact, and the vehicle had both a human operator and a very high-tech automated system? And both failed to brake. That’s brutal...
I hope they settled for millions.
Will they sue the car or the driver? It seems to me this situation is going to come up many more times if they continue experimenting with these systems. The driverless car deal seems like an answer to a question that wasn’t much asked.
Lots of times I need a word that I know, but these days my old memory is not working quite right, so keeping a thesaurus handy will remind me or give me a substitute. I meant no personal harm or pickiness.
I'd instruct my counsel to sue not only the driver but also the company pushing the driverless cars, and the county, city, and state of Arizona for allowing people on the streets to be used as guinea pigs.
Without trying to be contentious, I don't think you can say that. Sure, sober and alert should greatly diminish the risk, but from an actuarial point of view, there is a chance that this could happen, and an underwriter would take it into consideration.
But the point is, who pays for the incident (there is really no such thing as an "accident"): the maker of the driverless car, the owner, or the pedestrian (for the damage to the car)?
That's going to be the conundrum.
The Volvo emergency braking system was disabled while the Uber system was in use. The operator was supposed to act as the Uber emergency braking system. Back to the drawing board...
No offense taken; I share that old-timer problem.
“That goes for most of the semi-autonomous driving aids already in use, like Tesla’s ‘Autopilot’, for example, which should probably be banned for the false sense of security they give their drivers.”
Consumer Reports made that case after the first fatal autopilot crash in Florida a couple of years ago.
https://www.consumerreports.org/tesla/tesla-autopilot-too-much-autonomy-too-soon/
Wow... Clearly that is what the article says. I read it too.
But it doesn’t matter a whit if something was disabled or not disabled. The dead victim and their family certainly don’t care about that. The fact is the vehicle was equipped with a high end driverless system and a human operator on board, and both failed to brake prior to impact.
Clearly this driverless vehicle company, and the county/state that allowed them to test on their highways using everyone on the streets as guinea pigs, don’t give a damn about anything but $$.
Since I am not in the industry, I really don’t know how far along the tech is. If they can navigate a busy college campus during class changes, I will give their technology a nod. I’ve been on buses during class changes. The bus drivers know how to dance around students on their smartphones. I haven’t heard of any college allowing driverless testing on its campus.
Forget the sanitized academic campuses. Put them in a major international airport environment, where half the people are hopelessly lost and the other half are paid professional drivers who want you out of their way.
Of course these driverless cars would have to carry body bags with them in that environment.