Posted on 05/05/2025 7:53:13 AM PDT by ProtectOurFreedom
Tesla is fighting back against criticism of its Full Self-Driving features. The EV automaker took to the social media platform X to provide a robust defense of FSD's current state and future.
While few would take issue with this point, many consumers have been underwhelmed by FSD's version of autonomy thus far, given the need for drivers to remain attentive.
Tesla pushed back, "Although FSD Supervised currently does require your supervision, you will still notice that your commute or long drives are suddenly so much less taxing."
Teslarati added context, saying FSD "has a ton of potential." And there are supporters of FSD who have credited the feature with not only saving them from dangerous potential collisions, but also making often-stressful driving more pleasant. A Tesla owner in China shared remarkable footage showing FSD thriving in difficult conditions.
Tesla laid out the selling points of FSD: "No constant micro-adjustments in rush hour traffic. No frustration. Car does it all for you."
Not everyone has had the same experience with FSD, as there are drivers who anecdotally say the tech has made inexplicable maneuvers into traffic, and YouTuber Mark Rober went viral showing Tesla's Autopilot system, which relies on the same hardware as FSD, fail to stop the vehicle from crashing into a wall painted to look like additional road.
Rober's criticism is that Tesla uses only cameras, albeit with great software, instead of including any radar or lidar detection. Tesla's China rollout also faced challenges with drivers getting fines as the result of system errors.
Tesla is undeterred by these developments and is looking to go a step further by introducing Unsupervised FSD in Austin by June. Two of its Gigafactory locations already use the tech for operations, and despite the criticism, Tesla is moving forward quickly.
(Excerpt) Read more at tech.yahoo.com ...
We have robotaxis in the Phoenix area. Google’s Waymo has been active for several years. The other day, I saw one pull up to the library and an obviously blind man, complete with white cane, exited. I cannot argue with that use.
Thanks for the report. I pretty much decided this will be my next car...tired of the stressful drives on I-5. Yesterday I left my sister’s house earlier than I wanted to because I didn’t want to be driving after sunset. How does this work at night?
This is a step forward for flying cars (likely over the highways).
When FSD meets BSD, you die...horribly.
The braille keypads on ATMs were just a little ahead of their time.
“How does this work at night?” The sensors probably see FAR better than the human eye at night. Just making an educated guess here. Cameras in visible wavelengths might have glare problems like human eyes, but lidar won’t. Currently, Tesla relies on a camera-based system for its self-driving technology and does not use lidar in its production vehicles. However, it has purchased lidar sensors for testing and validation purposes to improve its vision-only system’s accuracy.
The braille dots on baseballs never did catch on, though.
...deny municipalities easy money by allowing a way to drink and travel in one's own vehicle without endangering anyone.
Just when car manufacturers are about to be forced to install remote cut-offs that the police will use to override one's control, too.
I do think the democrats will fight the fun parts of auto driving, no matter how safe.
Drivers and passengers end up in the hospital all the time with human driven vehicles.
“Car does it all”?
What will it do when someone on a bike drives between you and the car in the lane next to you? What will it do when you’re rounding a corner and there’s a kid playing with a ball ahead of you on your right and the ball goes rolling in front of you? Will it be able to handle “Park next to my green SUV in front of the left garage door”?
How about a road construction area? Will it read the signs? Will it handle a detour? Will it refuse to take over when there is a heavy rain/snow?
If it can’t do all of these it shouldn’t try to do any of these. The human remains responsible for everything the automation does.
I will not give up control and let some AI drive me. I might ride in a Johnny Cab, maybe.
This is an old story, now seven years old. On March 23, 2018, a Tesla crashed on California State Route 101 near Mountain View, which killed 38-year-old Apple engineer Walter Huang. The crash was caused by a combination of factors involving the Tesla Model X’s Autopilot system, driver behavior, road conditions, and infrastructure issues. Now, obviously the Tesla self-driving feature has improved a lot in seven years, but this points out an unfortunate case where all the holes in the Swiss cheese lined up.
The Tesla, traveling at approximately 71 mph, was in the high-occupancy-vehicle lane using Autopilot (Traffic-Aware Cruise Control and Autosteer). It veered left into the paved gore area (divider) between Highway 101 and the Highway 85 exit ramp, striking a damaged crash attenuator barrier. The impact caused the Tesla to catch fire, and it was then hit by a Mazda and an Audi in adjacent lanes.
The NTSB and other investigations identified multiple contributing factors, with no single cause definitively pinpointed:
Tesla Autopilot System Limitations: The Tesla was operating on Autopilot, which was engaged with adaptive cruise control set to minimum follow-distance. The system failed to detect faded lane lines and steered the vehicle into the gore area, accelerating to 70.8 mph before striking the barrier.
Faded Lane Markings: The NTSB noted that the highway’s left lane lines were faded, causing Autopilot to follow a bolder, incorrect line into the barrier.
Camera-Based System: Tesla’s Autopilot relied on cameras, not lidar or radar, which struggled with worn lane markings or branching lanes.
Prior Complaints: Huang had reported to family and Tesla that Autopilot repeatedly swerved toward the same barrier on prior drives, but Tesla couldn’t replicate the issue at the dealership.
NTSB Critique: The NTSB criticized Autopilot as a “Beta system” not fully developed, noting it allowed drivers to disengage attention, and Tesla’s marketing overstated its capabilities, fostering over-reliance.
Tesla’s Defense: Tesla claimed Autopilot wasn’t a hands-free system, requiring driver supervision, and that Huang didn’t respond to warnings. They also noted 85,000 safe Autopilot trips on that stretch since 2015.
Driver Distraction: The NTSB reported Huang was playing a video game on his smartphone at the time of the crash, suggesting distraction contributed. Data showed his hands were off the wheel for six seconds before impact, despite earlier visual and audible warnings.
Context: Huang had five seconds and 150 meters of clear view of the barrier but took no action, per Tesla’s logs.
Bold defense of self-driving tech: ‘Car does it all for you’.
The car runs over arsonists who touch them.
My wife got a free month of self driving on her Tesla. We only used it a few times and hated it, scary. Going through a local town here, there are no pavement markings on the right side, which it uses to center itself. Almost ran into a curb that sticks out. Then, in the local Wallymart, it couldn’t navigate all of the islands in the parking lot and got confused. She used it on the turnpike, and it got confused as to which tollbooth to take, and stopped.
If it is an EV, definitely not. I’m not sure about having self-driving in a conventional vehicle, as electronic failures can happen just like human failures. Maybe self-driving would be somewhat acceptable if you’ve reached the point in life where your driver’s license has been pulled.
Tesla Wants You to Stop Filming Its Cars Mowing Down Child-Sized Mannequins
I refuse to even ride shotgun with a live driver so all of this AI autopilot crap is never going to see my rump in the seat.
Hard. Pass.
As of now, you can’t use voice commands to control driving. To speed up, you either tap the accelerator or use a scroll wheel. The scroll wheel sets the maximum speed higher or lower. It doesn’t directly set a specific speed.
To trigger a lane change, just use the directional signal stalk (a button on some models) and the car will safely change lanes. Unfortunately, if you want to take another route, and let’s say make an immediate left at an intersection instead of going straight, you have to exit FSD and do it yourself.
Essentially, when FSD is on, it’s designed to be autonomous, and overriding it isn’t always seamless.
In all the obfuscation, muddied water and outright prevarication, the main point is missed.
Self-driving cars may be offered to the public, but the primary market is the Cybercab. The Cybercab is a threat of sea-change proportions. It is ready to go, now.
You tell Siri to bring up a Cybercab, you give the address of the destination and perhaps where you are. The driverless cab arrives and deposits you at the destination. You get out and leave... that’s it. Your phone and the cab are smart enough to provide a bill for services that is paid by the phone. You do nothing but place the order and then ride.
A very interesting thing happened last week. Serious negotiations for Cybercab service in Riyadh took place. Yes, women can drive, but for many reasons may not. They are not allowed to drive with anyone but a spouse or close relative. The Cybercab solves that dilemma.
I expect Riyadh to be a successful Cybercab test market.