Posted on 06/30/2016 9:18:54 PM PDT by BenLurkin
Joshua D. Brown, of Canton, Ohio, died in the accident May 7 in Williston, Florida, when his car’s cameras failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and didn’t automatically activate its brakes, according to government records obtained Thursday.
Frank Baressi, 62, the driver of the truck and owner of Okemah Express LLC, said the Tesla driver was “playing Harry Potter on the TV screen” at the time of the crash and driving so quickly that “he went so fast through my trailer I didn’t see him.”
“It was still playing when he died and snapped a telephone pole a quarter mile down the road,” Baressi told The Associated Press in an interview from his home in Palm Harbor, Florida. He acknowledged he couldn’t see the movie, only heard it.
Tesla Motors Inc. said it is not possible to watch videos on the Model S touch screen. There was no reference to the movie in initial police reports.
(Excerpt) Read more at washingtonpost.com ...
and besides... what could go wrong?
Another Tesla “autopilot” crash:
http://electrek.co/2016/05/26/tesla-model-s-crash-autopilot-video/
Another case where the software system failed, but an alert driver could have handled the situation easily.
Driver’s comment: “My bad.”
Where’s your “Voice of the Customer” now, GM? In fact, your Voice of the Customer was never demanding autonomous vehicles. Mary Barra, GM’s first-ever female Chairman & CEO, doesn’t listen to, nor does she want to hear, the voice of the customer. Instead, she listens to the “Voice of The Regime”!
But it burned absolutely zero gas in that final quarter mile!
This is because some a$$holes think they can write a program that replaces the human mind. It’s not possible.
This posting will make no difference, and idiots will go ahead with “self-driving” cars.
Watch out for me. ;)
I hate my 2000-mile twice-yearly drive. Looking forward to some new tech being a great help.
Let’s write this right... A trucker at an uncontrolled intersection missed that there was an oncoming car with its daytime headlights on, and turned in front of it. Neither the Tesla driver nor the Tesla braked before crashing into the trailer, shearing off the top of the car. The car came to rest when it struck a telephone pole farther down the road.
The trucker who missed the oncoming traffic before he blocked the road could still somehow identify the sounds of a Harry Potter movie over the sounds of a car impacting the trailer and most of the car continuing under the trailer and out the other side.
The only significant change from the age-old story of a negligent trucker blocking traffic with his rig is that the Tesla’s cameras could not detect the trailer across the lanes of traffic, falsely identifying it as an unimportant stretch of horizon.
My question is: didn’t the system recognize the cab turning across the path, before it interpreted the white trailer as empty space, and apply the brakes and alert the driver?
Apparently not. When I first heard of these systems I predicted some catastrophic failures. This is the first one I've heard about.
There are just too many variables to account for. And a human can instantly understand the context where a computer is hopelessly slow.
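To make that failure mode concrete, here is a purely hypothetical toy sketch in Python (made-up names and thresholds, nothing to do with Tesla’s actual software) of how a naive brightness-and-contrast check could lump a sunlit white trailer in with the bright sky behind it and never flag an obstacle:

# Purely hypothetical toy sketch -- NOT Tesla's actual software.
# Shows how a naive "open sky" check based on brightness and contrast
# could treat a large, uniformly bright white trailer as empty sky.

import numpy as np

BRIGHT_THRESHOLD = 0.85   # made-up: regions brighter than this are treated as "sky"
CONTRAST_THRESHOLD = 0.05 # made-up: low texture reinforces the "sky" guess

def looks_like_open_sky(region: np.ndarray) -> bool:
    """Naive classifier: a region is 'sky' if it is bright and low-contrast."""
    mean_luma = float(region.mean())
    contrast = float(region.std())
    return mean_luma > BRIGHT_THRESHOLD and contrast < CONTRAST_THRESHOLD

# A white trailer side lit by the sun: bright and nearly featureless,
# statistically similar to the washed-out sky patch above it.
sky_patch = np.full((64, 64), 0.92) + np.random.normal(0, 0.01, (64, 64))
white_trailer = np.full((64, 64), 0.90) + np.random.normal(0, 0.01, (64, 64))

print(looks_like_open_sky(sky_patch))      # True -- correctly ignored
print(looks_like_open_sky(white_trailer))  # True -- obstacle missed, no braking

The real system is far more sophisticated, but the reported failure (a large, bright, low-texture surface read as open sky) is the same kind of misclassification.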
The globalists would like nothing better than to not have to pay drivers of semi-trailers and taxis.
While I am personally sad for the driver's tragedy, the death might wake people up to the true dangers of these auto-pilot driving contraptions.
Tesla driver using Autopilot feature killed by tractor trailer
http://www.freerepublic.com/focus/f-chat/3445294/posts
The Tesla's driver had posted at least one accident avoidance video before, so I suspect that there's likely a full video of the accident. A police beat article at the time of the collision said 'charges were pending', so honestly, all of this really reads like smoke to cover the trucker and the company.
It does certainly show that the system is not infallible. But considering how many accidents of this exact type occur across the US highway system, I'm not sure it was preventable when a trucker negligently crosses oncoming traffic. Like those of all newer vehicles, the Tesla's daytime running lights are always on.
Shame the trucker apparently had incredible hearing, able to hear and identify a Harry Potter movie as the car crashed under the trailer and continued on after its top was sheared off, yet didn't have average vision to see the oncoming car.
I suspect this was after the car came to a complete stop and he went over to check on the driver.
No, he was watching Harry Potter.
I always thought driverless cars would lead to this.
Dumbest idea ever.