Posted on 12/22/2022 8:53:12 AM PST by SaxxonWoods
A glitch in Tesla Inc's (NASDAQ: TSLA) Full Self-Driving (FSD) caused an eight-vehicle crash on San Francisco's Bay Bridge, according to the driver involved.
(Excerpt) Read more at msn.com ...
No matter how good EVs may or may not be, they can't make this sound.
“No matter how good EVs may or may not be, they can’t make this sound.”
Wrong. Jump to 4:00
What do you drive?
The problem is that consciousness is actually a spiritual manifestation, and not purely physical. The brain that they’re so focused upon is only the physical mechanism that allows our spirit to interact with the physical world. Their belief is like thinking that the ventriloquist’s dummy is actually alive and conscious, and that if they only figure out the right grade of wood to use they can make one that is alive and can think and talk even without a ventriloquist behind it.
But just as the wooden dummy only presents the illusion that it is alive, so too does “AI” only present an illusion that it is intelligent and aware of its surroundings. AI will get better at this mimicry as computer processors continue to get faster with more parallel architecture, but it will never reach a point at which it can safely react to all of the randomness that occurs in the world, not even when limited to those things that can happen while driving a car. Sure, most of the time, even today, it can do a good job of driving (even better than humans at times), but when it inevitably fails, it will fail so spectacularly that it would almost always have been better to have had a human in control. The reason is that no amount of programming can anticipate every possible situation, while human consciousness and intelligence can sort those situations out in ways that we still don’t completely understand. We often do things intuitively, especially in a crisis, that don’t make strict logical sense or that are based upon an incomplete picture of the world. The computer’s downfall is that it ALWAYS follows a logical sequence according to its programming, which sometimes dooms it to failure, along with any human whose life depends upon its decision making.
From the article:
“The police report noted that the Tesla vehicle made an unsafe lane change and was slowing to a stop which led to another vehicle ramming it resulting in a cascade of crashes”
No doubt the self driving isn’t quite there yet but how is the self driving Tesla at fault for some jerk rear ending it?
And just to think that they are planning on putting driverless Semis out there on the road.
But only Tesla is calling it “Full Self Driving”.🙄
How many people can control a vehicle going that fast?🤨
And another tech FanBoi in the mix.
Don’t care.
That’s absolutely fantastic.
Maybe TexasGator and his BFFs, Ron and Elon, can collaborate on a project and have super duper, 9000hp, EVs make that noise.
It was the SF/Oakland Bay Bridge, which has one-way traffic on two decks. The GGB is less safe, having two-way traffic on a single deck separated by a temporary divider that is shifted daily for rush hour conditions. Used to be lots of head-on crashes before they installed the temporary divider. I always use the right lane on the GGB and stay away from the center lanes. As for the Bay Bridge, the side lanes are worst because of traffic slowing to exit at the Yerba Buena/Treasure Island exits.
Sure. Sure. Let’s take down Tesla and, in turn, Musk.
It wasn't FSD, it was Autopilot, just like you have on ICE vehicles. You can't use FSD on the highway, including the SF/Oakland Bay Bridge; it won't turn on. Autopilot is for the highway, FSD for city streets. Both require you to keep your hands on the steering wheel and take over if necessary. I don't have a Tesla, but I do have autopilot on my car and I refuse to use it, preferring to always have manual control.
Again, it was Autopilot if not manual control on that Tesla.
I like Musk and own Tesla. Auto-driving has a long way to go, that’s all.
Good comment.
“No doubt the self driving isn’t quite there yet but how is the self driving Tesla at fault for some jerk rear ending it?”
“The police report noted that the Tesla vehicle made an unsafe lane change and was slowing to a stop which led to another vehicle ramming it resulting in a cascade of crashes”
It’s easy to make people rear-end you no matter how much they are paying attention. Jump in their lane right in front of them and hit the brakes. Insurance scammers do it every day.
I left lane-follow on but still had to control the steering, as it would drift too far left and right instead of following a center path.
The first time I had everything switched on, I’d feel the steering wheel pulling to one side and I thought there was something wrong with our brand-new car, then realized that it was steering itself. It does make long journeys on the Interstate a bit easier—but as I said before, being a geezer, I keep my hands on the wheel.
Because the guy who caused the accident can always be relied on to fully, candidly and accurately report what caused the accident!
$200K+
I have stock in Tesla. With what is going on with Musk and Twitter, I do not trust anything.