There are two major issues that weigh against Uber here:
1. The "safety backup" driver was not paying attention.
2. All the reports I've seen indicate the brakes were never applied. This is a huge problem for Uber and the designer of the self-driving system. Even if the vehicle could not have stopped in time to avoid striking the pedestrian, the brakes should have been applied.
A human driver would have seen the woman. That's when I slow my speed and move into the other lane until I can figure out what the pedestrian is doing.
On the other hand, maybe we could launch these in SF when the bicyclists are out.
Oh, I imagine Uber's going to get taken to the cleaners in litigation over this. As I said on another thread, I'll bet some "suit" at Uber decided that FLIR wasn't worth the expense to include in a self-driving car. Had it had one, the vehicle probably would have detected the bicycle-lady well before she stepped into its path and avoided hitting her.
Well, now Uber may go under as a result of this.
All the reports I’ve seen indicate the brakes were never applied. This is a huge problem for Uber and the designer of the self-driving system. Even if the vehicle could not have stopped in time to avoid striking the pedestrian, the brakes should have been applied.
—
Well, yes. Obviously the brakes should have been and would have been applied if the system had recognized the object as a person.
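The point that braking matters even when a full stop is impossible follows from basic kinematics: shedding speed over whatever distance remains still lowers the impact speed. A rough sketch (the numbers here are illustrative assumptions, not figures from the incident report):

```python
import math

def impact_speed(v0, decel, distance):
    """Speed (m/s) after braking at `decel` m/s^2 over `distance`
    meters, starting from v0 m/s; 0.0 if the car stops first."""
    v_sq = v0 ** 2 - 2 * decel * distance
    return math.sqrt(v_sq) if v_sq > 0 else 0.0

# ~38 mph is ~17 m/s; assume ~7 m/s^2 of hard braking on dry pavement.
# A full stop from 17 m/s needs 17**2 / (2 * 7) ~= 20.6 m of road.
v_no_braking = impact_speed(17.0, 7.0, 0.0)    # 17.0 m/s
v_late_brake = impact_speed(17.0, 7.0, 10.0)   # ~12.2 m/s
```

Even braking that starts with only 10 m to spare cuts the impact speed by roughly a third under these assumed numbers, which is why "the brakes were never applied" is so damning.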
The part that’s difficult to comprehend is that the driverless system is nothing more than a giant algorithm. It has no concept of actual objects, like roads, people, or squirrels darting across a road.
You look at a person and you see a person. You know it’s a person. You know you shouldn’t run over a person while driving.
The driverless car receives millions of data points: its GPS location, pixels of light from its camera system, laser returns from its onboard LIDAR, and pulses from its radar. Then, using only its algorithms (it is not conscious), it has to plot what to do in the next split second. No problem, as computers are super fast.
But how does it digest all the raw information and interpret it? That’s the trick. The system has to discount shadows, identify other cars (pretty easy, they’re large and radar reflective), and other moving things.
It’s those other moving things that are tricky. You know you’d avoid running over a squirrel if you could help it, but you would run over a squirrel if by swerving you’d hit another car or a tree. That’s all obvious to a thinking person. The machine doesn’t know any of this except by training for all the possible scenarios.
This woman on a bike is one of those scenarios that was not anticipated. A good driver might have automatically given a bike a wider berth just in case the bike rider wobbled or veered. A good driver would do this because they have a knowledge of how bike riders operate. The computer program likely saw nothing more than an object that was originally vectored on a course that would not interact with the vehicle. The code didn’t “know” enough to provide a margin for error.
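That "margin for error" idea can be sketched as a toy prediction loop: extrapolate each tracked object at constant velocity, find its closest approach to the car's own path, and demand a wider buffer for wobbly object classes. Everything here (function name, numbers, the constant-velocity assumption) is made up for illustration; it is not Uber's actual planner.

```python
def closest_approach(car_pos, car_vel, obj_pos, obj_vel, horizon=3.0, dt=0.1):
    """Minimum predicted distance (m) between car and object over
    `horizon` seconds, assuming both hold constant velocity."""
    min_d = float("inf")
    t = 0.0
    while t <= horizon:
        dx = (car_pos[0] + car_vel[0] * t) - (obj_pos[0] + obj_vel[0] * t)
        dy = (car_pos[1] + car_vel[1] * t) - (obj_pos[1] + obj_vel[1] * t)
        min_d = min(min_d, (dx * dx + dy * dy) ** 0.5)
        t += dt
    return min_d

# Cyclist 20 m ahead and 3 m to the left, drifting toward the lane at
# 1 m/s; car doing 17 m/s. The point prediction says the paths nearly cross.
d = closest_approach((0, 0), (17, 0), (20, 3.0), (0, -1.0))

# A cautious planner pads the prediction for bikes and pedestrians,
# exactly the "wider berth" a human gives. (Margin value is made up.)
WOBBLE_MARGIN = 2.0
should_brake = d < WOBBLE_MARGIN
```

A planner that only checks the point prediction against a tight threshold concludes "no conflict" right up until the drift closes the gap; the padded margin is what encodes the human intuition that bike riders wobble.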
At this point, we’re at the Windows 1.0 level of driverless vehicles.
Example... someone is in a parking lot... there are no crosswalks... so I can hit him??
And remember, this car had not one... but two drivers. One was dozing... the other (the magic car), all his life, wanted a bike.
The narratives I’ve read suggested that she bolted out from between cars, but the video I saw showed her, two lanes over, casually strolling across the avenue.
I’m surprised that the sensors didn’t “see” her.
1. The safety backup driver was not paying attention.
This shouldn't matter. The problem with all these semi-self-driving systems is that shortly after gaining some confidence in the system, the human backup will cease paying attention just about every time.
2. All the reports I've seen indicate the brakes were never applied. This is a huge problem for Uber and the designer of the self-driving system. Even if the vehicle could not have stopped in time to avoid striking the pedestrian, the brakes should have been applied.
Right. In fact, the car probably should have been able to avoid the woman (who bears a lot of fault for obviously not paying attention to traffic at all), since its LIDAR should have seen her even in the dark. I'm waiting to hear the explanation for the LIDAR not functioning as designed.