software bug
I remember when software bugs just meant you had to restart your computer; nowadays they crash your car and kill you...
The Ohio State Police made a movie called “Signal Seven” that was used for driver’s education when I went to high school in Pennsylvania. Gruesome photos designed to scare us into driving safely. It mostly worked. I guess they have another entry.
failed to distinguish the white side of a turning tractor-trailer
The car expected ALL tractor-trailers to be painted white?
he went so fast through my trailer I didn’t see him.
Through my trailer?
he died and snapped a telephone pole a quarter mile down the road
A quarter mile down the road? So what killed the driver?
Usually, when a self-driving car gets into an accident, disclaimers are rapidly issued: it was the fault of the driver in the other vehicle refusing to follow the law, not the self-driving vehicle. I don’t hear them saying that yet.
This may be an unfortunate learning moment that will have to be considered from now on. I thought the car would have had sensors on the outside to detect the oncoming truck. With this death, the prospect of self-driving autos has suddenly become serious.
how’s the car?
One of those Apollo 1 moments where scientists say “well, we hadn’t considered that...”
Yeah, no thanks. I’ll do the driving if you don’t mind.
Okay, technologists ... this wonderful, visionary pipe dream of automated traffic fleets zipping around a dense urban area, or even an area without decent roads, gets its first high-profile lawsuit (one that Tesla the company is liable for) to bring everyone back to Earth as to why this is not going to happen for a very long time (not without completely rebuilding the road system to support it).
later
I have coined a new term to describe why I don’t think self-driving cars are technologically feasible or desirable to any safe, long-time, high-performance independent driver who seeks personal autonomy and the freedom to travel at will anywhere they desire. That term is: Technological Hubris. It comes from my observation that most scientists and new engineers are of a decidedly left-wing philosophical bent. They believe, as part of their philosophy, that people and technology are perfectible.
I have lived in northern Michigan for just shy of forty years, and I can say with the utmost confidence that engineers can’t program a vehicle to anticipate the unpredictability of other people, vehicles, or even animals. One time, on my way to Boyne City from Charlevoix, a doe leapt clear over the hood of my girlfriend’s Corvette, leaving only a superficial scratch on the paint of the left fender. How do you program a vehicle, with all its numerous sensors, to compute that uncertainty/unpredictability factor into an inanimate autonomous machine?
Here is a good collection of Tesla Autopilot stories that the liberal lefty geeks over at Ars Technica like to share about technology. I generally like their content, but they are massively skewed leftist.
http://arstechnica.com/search/?ie=UTF-8&q=tesla+auto+pilot
Radio report said the driver was a former SEAL
A lot of us have been pointing out that the deep-pocket liability for driverless cars is going to put this idea out of business. Here is a prime example.
First of all, this wasn’t a “self driving” car.
All these driver assistance features (e.g. autonomous braking) that are now available are no substitute for an alert driver.
I’m sure if the guy actually had read his owner’s manual, it would have told him exactly that. I know that’s expecting quite a lot.
Sorry the guy is dead, but he clearly wasn’t paying attention to what he was doing. Too bad he was too busy with other stuff to see the truck that the car’s sensors missed.
And I think that stuff about the white side of the truck being indistinguishable from the sky is BS. The autonomous braking systems that I am familiar with rely on radar, not light. More likely, the Tesla was going too fast for the systems to work well enough to save this moron’s life.
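The “going too fast” point can be made with back-of-the-envelope arithmetic. Here is a minimal sketch assuming an illustrative radar detection range, system latency, and braking deceleration; none of these numbers come from the actual crash report, they just show how speed alone can put a stop outside an automatic braking system’s reach:

```python
# Illustrative sketch: why speed limits what autonomous emergency
# braking (AEB) can do. All parameters are assumptions for the
# example, not figures from any real vehicle or incident.

def stopping_distance_m(speed_mps, reaction_s=0.3, decel_mps2=8.0):
    """Distance covered during system latency plus full braking.

    reaction_s  -- assumed sensor/actuator latency in seconds
    decel_mps2  -- assumed peak braking deceleration (~0.8 g)
    """
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

def can_stop(speed_mph, detection_range_m=100.0):
    """True if the car could stop within an assumed detection range."""
    speed_mps = speed_mph * 0.44704  # convert mph to m/s
    return stopping_distance_m(speed_mps) <= detection_range_m

print(can_stop(40))  # True: ample margin at moderate speed
print(can_stop(90))  # False: stopping distance exceeds the assumed range
```

Because stopping distance grows with the square of speed, doubling the speed roughly quadruples the braking portion of the distance, so a system that looks comfortable at 40 mph can be physically unable to stop in time at 90 mph, no matter how good its software is.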
And besides ... what could go wrong?
Another Tesla “autopilot” crash:
http://electrek.co/2016/05/26/tesla-model-s-crash-autopilot-video/
Another case when the software systems failed, but an alert driver could have handled the situation easily.
Driver’s comment: “My bad.”
Where’s your “Voice of the Customer” now, GM? In fact, your Voice of the Customer was never demanding autonomous vehicles. Mary Barra, GM’s first-ever female Chairman & CEO, doesn’t listen to, nor does she want to hear, the voice of the consumer. Instead, she listens to the “Voice of The Regime”!
But it burned absolutely zero gas in that final quarter mile!
This is because some a$$holes think they can write a program that replaces the human mind. It’s not possible.
This posting will make no difference, and idiots will go ahead with “self-driving” cars.