Free Republic
Browse · Search
News/Activism
Topics · Post Article

To: Former Proud Canadian
There's no question that the pedestrian is primarily at fault here -- probably a minimum of 60% liable if this were a case in litigation.

There are two major issues that weigh against Uber here:

1. The "safety backup" driver was not paying attention.

2. All the reports I've seen indicate the brakes were never applied. This is a huge problem for Uber and the designer of the self-driving system. Even if the vehicle could not have stopped in time to avoid striking the pedestrian, the brakes should have been applied.

3 posted on 03/22/2018 9:35:58 AM PDT by Alberta's Child ("I saw a werewolf drinking a pina colada at Trader Vic's.")


To: Alberta's Child

The woman would have been seen by a human driver. That’s when I slow my speed and move into the other lane until I can figure out what the pedestrian is doing.

On the other hand, maybe we could launch these in SF when the bicyclists are out.


22 posted on 03/22/2018 10:05:51 AM PDT by Nifster (I see puppy dogs in the clouds)

To: Alberta's Child
There's no question that the pedestrian is primarily at fault here -- probably a minimum of 60% liable if this were a case in litigation.

Oh, I imagine Uber's going to get taken to the cleaners in litigation over this. As I said on another thread, I'll bet some "suit" at Uber decided that FLIR wasn't worth the expense to include in a self-driving car. Had it had one, the vehicle probably would have detected the bicycle-lady well before she stepped into its path and avoided hitting her.

Well, now Uber may go under as a result of this.

30 posted on 03/22/2018 10:14:41 AM PDT by COBOL2Java (The bigger the government, the smaller the citizen)

To: Alberta's Child

All the reports I’ve seen indicate the brakes were never applied. This is a huge problem for Uber and the designer of the self-driving system. Even if the vehicle could not have stopped in time to avoid striking the pedestrian, the brakes should have been applied.

Well, yes. Obviously the brakes should have been and would have been applied if the system had recognized the object as a person.

The part that’s difficult to comprehend is that the driverless system is nothing more than a giant algorithm. It doesn’t know about actual objects like roads or people, or squirrels darting across a road, or anything else.

You look at a person and you see a person. You know it’s a person. You know you shouldn’t run over a person while driving.

The driverless car receives millions of data points: its GPS location, pixels of light from its camera system, laser pulses from its onboard LIDAR. Then, using only its algorithms (it is not conscious), it has to plot what to do in the next split second. No problem, as computers are super fast.

But how does it digest all the raw information and interpret it? That’s the trick. The system has to discount shadows, identify other cars (pretty easy, they’re large and radar reflective), and other moving things.
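The "digest and interpret" step described above can be sketched as a toy classifier over fused sensor returns. Every field name and threshold here is hypothetical, purely to illustrate the kind of bucketing involved; a real perception stack uses trained models, not hand-written rules like these:

```python
# Toy sketch of the "interpret the raw returns" step discussed above.
# All fields and thresholds are illustrative, not from any real system.

def classify(obj):
    """Crudely bucket a tracked object from fused LIDAR/camera data."""
    if obj["reflectivity"] > 0.8 and obj["width_m"] > 1.5:
        return "vehicle"          # large and strongly reflective: easy case
    if obj["speed_mps"] == 0 and obj["height_m"] < 0.1:
        return "shadow/ground"    # flat and stationary: discount it
    if obj["speed_mps"] > 0:
        return "moving object"    # the tricky category
    return "static obstacle"

detections = [
    {"reflectivity": 0.9, "width_m": 1.8, "height_m": 1.5, "speed_mps": 12.0},
    {"reflectivity": 0.1, "width_m": 2.0, "height_m": 0.0, "speed_mps": 0.0},
    {"reflectivity": 0.3, "width_m": 0.6, "height_m": 1.7, "speed_mps": 1.4},
]
print([classify(d) for d in detections])
# → ['vehicle', 'shadow/ground', 'moving object']
```

Note how the third detection, a slow-moving pedestrian-sized object, lands in the vaguest bucket; everything downstream depends on what the system does with it.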

It’s those other moving things that are tricky. You know you’d avoid running over a squirrel if you could help it, but you would run over a squirrel if by swerving you’d hit another car or a tree. That’s all obvious to a thinking person. The machine doesn’t know any of this except by training for all the possible scenarios.

This woman on a bike is one of those scenarios that was not anticipated. A good driver might have automatically given a bike a wider berth just in case the bike rider wobbled or veered. A good driver would do this because they have a knowledge of how bike riders operate. The computer program likely saw nothing more than an object that was originally vectored on a course that would not interact with the vehicle. The code didn’t “know” enough to provide a margin for error.
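The "vectored on a course that would not interact" idea, and the missing margin for error, can be illustrated with a toy linear extrapolation. All numbers here (lane width, speeds, time horizon) are made up for the sketch:

```python
# Toy sketch of a course-interaction check. Positions in meters,
# velocities in m/s; every number below is invented for illustration.

def will_conflict(ped_pos, ped_vel, car_lane_x, horizon_s=3.0, margin_m=0.0):
    """Linearly extrapolate the pedestrian and ask whether she enters
    the car's lane (plus an optional safety margin) within the horizon."""
    for i in range(int(horizon_s * 10) + 1):
        t = i * 0.1
        x = ped_pos[0] + ped_vel[0] * t
        if abs(x - car_lane_x) <= 1.5 + margin_m:   # half lane width + margin
            return True
    return False

# A cyclist two lanes over (7 m away), drifting toward the car at 1.4 m/s.
ped, vel, lane = (7.0, 0.0), (-1.4, 0.0), 0.0

print(will_conflict(ped, vel, lane, margin_m=0.0))  # → False (strict lane test)
print(will_conflict(ped, vel, lane, margin_m=2.0))  # → True  (human-like berth)
```

With zero margin the extrapolated path never "interacts" with the vehicle, so nothing happens; a generous margin flags the same trajectory as a conflict, which is the wider berth a good human driver gives by default.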

At this point, we’re at the Windows 1.0 level of driverless vehicles.


37 posted on 03/22/2018 10:18:19 AM PDT by Flick Lives

To: Alberta's Child
Not too many cases where a pedestrian is liable....not even a little bit.

Example....someone is in a parking lot....there are no crosswalks....so I can hit him??

And remember, this car had not one....but two drivers. One was dozing...the other (the magic car), all his life, wanted a bike.

47 posted on 03/22/2018 10:30:37 AM PDT by Sacajaweau

To: Alberta's Child

The narratives I’ve read suggested that she bolted out from between cars, but the video I saw showed her two lanes over, casually strolling across the avenue.

I’m surprised that the sensors didn’t “see” her.


61 posted on 03/22/2018 11:03:10 AM PDT by rockrr (Everything is different now...)

To: Alberta's Child

“1. The “safety backup” driver was not paying attention.”

This shouldn’t matter -- the problem with all these “semi self-driving” systems is that shortly after gaining some confidence in the system, the “human backup” will cease paying attention just about every time.

“2. All the reports I’ve seen indicate the brakes were never applied. This is a huge problem for Uber and the designer of the self-driving system. Even if the vehicle could not have stopped in time to avoid striking the pedestrian, the brakes should have been applied.”

Right. In fact, the car probably should have been able to avoid the woman (who bears a lot of fault for obviously not paying attention to traffic at all) since its LIDAR should have seen her even in the dark. I’m waiting to hear the explanation for the LIDAR not functioning as designed.
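A rough back-of-envelope calculation supports this. Assuming a vehicle speed of about 38 mph (as widely reported), a firm 7 m/s² deceleration, a half-second system reaction latency, and a 100 m detection range typical of automotive LIDAR -- all of these are assumptions for the sketch, not established facts of the case:

```python
# Back-of-envelope check: could LIDAR-range detection have allowed a
# full stop? Speed, range, latency, and deceleration are all assumptions.

speed_mph = 38.0                      # roughly the reported vehicle speed
v = speed_mph * 0.44704               # convert to m/s, about 17 m/s
decel = 7.0                           # m/s^2, firm braking on dry asphalt
latency = 0.5                         # s, assumed system reaction time

# Distance covered during the reaction, plus kinematic braking distance.
stopping_m = v * latency + v**2 / (2 * decel)
print(round(stopping_m, 1))           # → 29.1 (meters)

lidar_range_m = 100.0                 # assumed detection range
print(stopping_m < lidar_range_m)     # → True: ample room to stop
```

Under these assumptions the car needs roughly 30 m to stop from the moment the system reacts, far inside the assumed detection range -- which is why the absence of any braking at all is the hard question.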


91 posted on 03/22/2018 12:31:38 PM PDT by PreciousLiberty (Make America Greater Than Ever!)

