Posted on 03/20/2018 3:24:24 AM PDT by grundle
Two days ago, a self-driving Uber car struck and killed a woman in Arizona. The car had a backup human driver behind the wheel who could take control at any time. The woman who was killed was walking in the street, but not in a crosswalk.
In my opinion, the government should get a warrant from a judge to require Uber to release its camera footage of the collision to the public. As long as we don’t get to see the footage, we can only speculate as to who was at fault.
If it was in fact Uber’s fault, then the public has a right to know, and Uber should be required to pay $10 million to the family of the victim. (I also believe that anyone who fakes such an accident in order to commit insurance fraud should get 10 years in jail for insurance fraud, in addition to whatever punishment they get for killing someone.)
If it’s the pedestrian’s fault, then knowing this information would prevent people from mistakenly thinking that self-driving cars are more dangerous than they actually are.
So far, Uber’s self-driving cars have a death rate of roughly one death per 2 million miles. By comparison, human-driven cars have about one death per 100 million miles. These are rough numbers, not exact figures, and the sample size for Uber’s self-driving cars is too small. Still, from what we know so far, Uber’s self-driving cars have a death rate per mile roughly 50 times that of human-driven cars. If this death was the pedestrian’s fault, then it gives no reason to be afraid of self-driving cars. But if the death is Uber’s fault, then it’s a sign that something is seriously wrong with Uber’s self-driving cars, even though the sample size is small. In matters of life and death, even one death is too many when only 2 million miles have been driven. The sample size is small, but that doesn’t change the fact that a person is dead.
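The rough arithmetic behind that "approximately 50 times" figure can be sketched as follows. The mileage and death counts are the poster's rough estimates, not exact data:

```python
# Rough death-rate comparison using the thread's approximate figures.
uber_deaths = 1
uber_miles = 2_000_000          # ~2 million miles driven by Uber's test fleet (rough estimate)
human_deaths = 1
human_miles = 100_000_000       # ~1 death per 100 million human-driven miles (rough estimate)

uber_rate = uber_deaths / uber_miles        # deaths per mile, self-driving
human_rate = human_deaths / human_miles     # deaths per mile, human-driven

ratio = uber_rate / human_rate
print(ratio)  # → 50.0
```

With such a tiny sample (one death), this ratio is dominated by chance; it illustrates the poster's point rather than a statistically reliable rate.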
The SUV was speeding and made no attempt to brake. Why is the public being used as guinea pigs (to the death) for deadly tech that isn’t ready for the real world?
It should never be the fault of the pedestrian if a self-driving algorithm can be programmed to avoid objects in its path, which is essentially what the entire car is designed and engineered to do.
“Why is the public being used as guinea pigs (to the death) for deadly tech that isn’t ready for the real world?”
This is the same mentality that gave us the curlicue light bulbs. They are doing it because we are children who can’t decide what is best for us.
What was the speed limit?
The police say they have the video, and it does show the pedestrian.
The mini SUV was going 38 in a 35. The autonomous driving level was Level 2, meaning a driver was at the wheel and ready to take control.
We are told the video shows the woman stepped into the path of the car from an unlit center median at dusk. If there are five cameras recording the event, including one recording the driver, what incentive would the police have to lie?
I hope you never had an accident using cruise control.
Given that we have 40,000 or more auto fatalities per year in this nation alone, it is a given that we will keep having them as autonomous cars come into use. Maybe they get reduced. That is still an open question, but there is good reason to believe that auto fatalities may decrease overall in a world where cars drive themselves. However, I don't see us ever getting to zero fatalities or anywhere close to it. As someone who drives the highways on a regular basis, I'm actually amazed we don't have more fatalities. Same with commercial air travel.
No, they are doing it for the money.
35 mph.
Who in God's name would be dumb enough to use cruise control in an area requiring a 35 MPH speed limit??
Other than Uber, apparently....
“We are told the video shows the woman stepped into the path of the car from an unlit center median at dusk.”
So the sensors can’t operate correctly in those conditions and the program can’t respond in a timely manner?
Generally, pedestrians have an IQ of about 50.
When people sit at a desk and focus on a problem they average an IQ of 100. But when their legs and minds start wandering in public, their IQ drops in half. Give ‘em a “smart” phone and the IQ drops by half again.
This situational obliviousness is exacerbated by far-left judges and trial lawyers who place zero burden of responsibility on individuals for their own well-being. It’s always someone else’s responsibility, usually We the Taxpayers’.
Woman who got hit sounds like a Darwin Award nominee.
I am not saying that the idea is bad. I am saying the tech clearly isn’t ready to be unleashed on the public. I live near an airport/military aviation base. Ten crashes in the vicinity in 20 years. Why? Firefighting aircraft and air racing. I can live with that. A self-driving car that can’t “see” me under certain conditions and is experimental is not acceptable.
According to this article the speed limit is 45 and the car was going 40.
The only way I would absolve the car would be if the woman were completely unseen beforehand, for example jumping out from behind parked trucks into the path of the oncoming vehicle. Pushing a bike is slow movement and should definitely have been tracked.
I think this is the article where the 35 mph figure comes from, and the source is the police chief, who says it appears Uber is not at fault.
Read the article in #19.