Posted on 03/22/2018 9:31:12 AM PDT by Former Proud Canadian
No excerpt.
Video of accident at link.
(Excerpt) Read more at apnews.com ...
The woman needed an RFID chip and to have her real-time position transmitted to the vehicle so it could slow to a crawl as it passed her, so that she couldn't jump out in front and get hit.
There are two major issues that weigh against Uber here:
1. The "safety backup" driver was not paying attention.
2. All the reports I've seen indicate the brakes were never applied. This is a huge problem for Uber and the designer of the self-driving system. Even if the vehicle could not have stopped in time to avoid striking the pedestrian, the brakes should have been applied.
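The point about the brakes can be made concrete with a simple time-to-collision check. This is a minimal sketch, not Uber's actual logic; the threshold, distance, and speed values below are assumptions for illustration only.

```python
def should_emergency_brake(distance_m: float, closing_speed_mps: float,
                           ttc_threshold_s: float = 2.0) -> bool:
    """Return True when the time-to-collision with a detected obstacle
    falls below the threshold. Illustrative only; a real system fuses
    multiple sensors and tracks object trajectories over time."""
    if closing_speed_mps <= 0:
        return False  # obstacle is not getting closer
    time_to_collision = distance_m / closing_speed_mps
    return time_to_collision < ttc_threshold_s

# A pedestrian 30 m ahead with the car closing at 17 m/s (~38 mph)
# gives a TTC of about 1.76 s, below the 2 s threshold, so even this
# crude rule would command braking, whether or not a full stop was possible.
```

Even when a collision is unavoidable, shedding speed before impact matters, which is why "the brakes were never applied" is the damning detail here.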
Perhaps the car did spot her, and just wanted a taste of human blood. Now that it has acquired a taste for human blood, the car will have to be put down.
Yep. Having hit a pedestrian in broad daylight myself, and watching lots of dashcam videos as well as having my own, I have little doubt that if I had been driving that car I would have hit that woman.
BTW, no, the time I hit the pedestrian it was not my fault. Not even close. It was never suggested it was.
If you’re gonna hang out in the middle of the street at night, make sure there are no cars coming.
These things are not gonna survive First Contact with the American Tort Bar. Just sayin’.
Post of the day!
I would assume autonomous cars use radar? Or infrared? If it uses visual input, then I'm afraid.
With that assumption, and assuming the sensors aren't defective, the software failed, big time.
It’s a dumb computer program connected to a video camera. Perception is impossible for it. The best it can do is compare the instantaneous results of its algorithms with stored representations of various objects and organisms. None of that is perception, as humans perceive.
Expecting a computer program to perceive a human putting herself in danger is asking too much of the computer program. People can’t always perceive it either, but people aren’t marketed as “autonomous” cars.
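The "compare against stored representations" idea described above can be sketched as a toy nearest-template classifier with a confidence cutoff. This is an invented illustration, not any real perception stack; the feature values, labels, and cutoff are all assumptions.

```python
def classify(features, templates, max_distance=1.0):
    """Return the label of the nearest stored template, or None if
    nothing is close enough: the rejection failure mode discussed above."""
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        # Euclidean distance between the observed features and the template
        dist = sum((f - t) ** 2 for f, t in zip(features, template)) ** 0.5
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= max_distance else None

# Hypothetical two-dimensional feature templates
templates = {
    "pedestrian": [1.0, 0.2],
    "bicycle": [0.4, 0.9],
}

# A clean match classifies fine, but an input unlike any template
# (say, a person walking a bicycle) can be rejected outright.
classify([1.0, 0.2], templates)   # matches "pedestrian"
classify([2.5, 2.5], templates)   # too far from everything: None
```

The brittleness is the point: a system like this has no notion of "person in danger," only of distances to stored examples, which is why the posters above distinguish this from human perception.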
The answer is to use these systems to ASSIST human drivers. I like to know what’s around me, and if it is a potential danger, just something to be aware of, or if it doesn’t matter at all. I’m in control. Not some dumb program.
Too much assistance, and the driver becomes lazy and relies too much on the system.
Yes, it is impossible to believe an object in the middle of the road was missed... LIDAR does not rely on visible light... she didn't "dart out" in front of the car as the original reports claimed. She was clearly walking her bike across the street.
agreed
It's horrifying to see the victim blaming and absolute lack of accountability being expressed here.
The driver had more than enough time to avoid the human. The driver was reckless and dangerous.
I’m not seeing any speculation in any venue that the car’s LIDAR sensors worked or didn’t work. That’s what’s supposed to be the big deal with autonomous cars, that they have visual plus radar.
Did it even start to slow down? It’s not a viable system if it has zero recognition of a pedestrian in the middle of the road.
One of the biggest challenges is that it will be years -- actually DECADES -- before every vehicle on the road is fully autonomous. In the meantime, we'll exist in this limbo-like transitional phase where people will either rely on the technology too much (they'll lose their driving skills even though the cars aren't fully autonomous) or too little (they won't even bother paying a premium for these automated features).
Oops, now that I've read the article, I may consider myself an "expert". :-)
I don't know if the accident could have been avoided. But clearly the safety "engineer" was not paying attention... and there is no excuse for a system that, as far as I can tell from the video, did not react in any way to an object slowly crossing its path.
One thing is for certain: this woman did not DART out... the system didn't "see" her for some reason, or did not react if it did.
I suspect the level of sympathy you'll find for her is inversely proportional to the amount of time people spend driving among idiots (drivers and pedestrian alike).
Christine.
The problem is... Computers don't actually malfunction... The people who program them incorrectly do.
Self driving cars are still a fantasy and to a large degree always will be. People like driving and if they don't, they like somebody else driving.
Indeed. What bugs me the most is that we're all involuntary and unknowing participants in their testing, perhaps for decades.
When a new version of Windows comes out, I can sit back and laugh at the problems of forced upgrades, new spyware, compatibility problems, etc., because I don't have to upgrade right away. I can wait until the worst of the bugs are worked out.
But now we're in a position where this lady was a few minutes before her death. Is some driverless car driving toward me now? Will it have the "brains" to not run me over? If these ever go national, we're all part of the testing process every time we're on the street for a decade, maybe two.