Posted on 03/18/2025 7:47:16 AM PDT by ImJustAnotherOkie
If you need more proof that electronic driving aids don’t make a car autonomous, here it is. YouTuber Mark Rober put his Tesla Model Y’s Autopilot technology to the test by setting up a Looney Tunes-style trap. Rober set up a styrofoam wall with a picture of a road printed on it in the middle of an actual road to see how the system reacts. The test sheds light on the differences between cameras and lidar.
Rober put the Model Y, whose Autopilot technology relies on cameras, head to head against a Lexus RX-based prototype fitted with lidar. Autopilot passes the first two tests—stopping for a mannequin standing in the road and stopping for a mannequin that runs into the road—but it doesn’t detect the mannequin in fog or in rain. Lidar sees the kid-sized mannequin regardless of road conditions.
The final test draws inspiration from the famous Wile E. Coyote versus Road Runner cartoons—meep, meep! The wall stretches the entire width of the road and blends in surprisingly well with the landscape surrounding it. It presumably wouldn’t fool a human driver, and it doesn’t fool the lidar-equipped Lexus, either. The prototype detects that it’s speeding toward a wall and stops without drama. The Model Y starts driving toward the wall at about 15:00 in the video. Rober reaches 40 mph, engages Autopilot, and crashes right through the wall.
The test took place in broad daylight, without rain or fog, so it’s not like the Tesla couldn’t see that it was driving toward a wall. However, one of the key differences between the two technologies is that the lidar is scanning the road ahead and detecting the wall itself. What’s printed on the wall isn’t taken into account by the car’s brain. Autopilot’s cameras, on the other hand, rely on what they see, and in this case, that’s a road.
Only two cars were included in the test, but I’m guessing that many other camera-based semi-autonomous systems would have failed as well; the error isn’t Tesla-specific by any means. And, granted, the odds of encountering a wall that looks almost exactly like a road while you’re commuting are pretty low, but the test does a good job of highlighting what each technology is (and, crucially, isn’t) capable of.
And, on a secondary level, the Wile E. Coyote test shows that Autopilot doesn’t make a car autonomous by any stretch of the imagination. It’s a Level 2 system, meaning the driver needs to keep both eyes on the road and both hands on the steering wheel.
Tesla CEO Elon Musk once called lidar “a crutch,” but this isn’t the first time we’ve seen videos that highlight the limitations of camera-based electronic driving aids. In 2022, a Model Y failed to detect a dummy that was standing in the middle of a road at night.
Idiotic “test,” as this “scenario” would never occur.
Yup, unrealistic.
At night, in the rain, I have been very bewildered about what I was driving toward. I trust my driving a lot more than the Tesla auto-pilot.
Who’s to say what would happen with a billboard in the wrong place. Not a problem for a human but software can be fooled.
Have you ever looked ahead and been unsure what you're seeing? This shows that the car can have the same confusion. It's a realistic test from the standpoint that if you're presented with an ambiguous scenario, the car just might charge ahead.
This exact scenario could play out where you are going up a bridge with a billboard of an open road ahead of you...
What kind of mannequin does he have that runs into the road?
After hitting the mannequin, does the car back up and run over it again, to make sure it can’t testify against the car?
Rober is a genius and has a great youtube channel. His squirrel videos are must see.
And yet it did occur on multiple cartoons!
Mostly, but it does show the difference in capability between the two systems. And the body-in-the-road test in foggy and rainy conditions is not unrealistic.
Curious, if this ever becomes a thing, and an accident or worse death occurs under autodrive. Who is legally responsible?
The average American mouth breather would also drive right into that image. Same for the teenager who is texting and driving, or doing some other idiot TikTok activity in the driver’s seat.
That said, fusing stereo optic vision with either LIDAR or MMW radar seems to be the best approach for active emergency braking and collision avoidance. MMW has the added advantage of being unaffected by fog or snow, whereas LIDAR is to varying degrees. Using stereo optics plus a single-beam look-ahead MMW Doppler radar solves this issue for good. The radar beam would sense the approaching stationary object via its Doppler shift. Styrofoam is transparent to radar, so it would need to be a real object of mass like a person, a car, or a motorcycle.
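For what it’s worth, the Doppler idea checks out on the back of a napkin. A minimal sketch (assuming a typical 77 GHz automotive radar band and the 40 mph closing speed from the video—both are illustrative choices, not figures from the test) of the two-way Doppler shift a monostatic radar would see from a stationary obstacle:

```python
# Sketch: two-way Doppler shift for a monostatic radar closing on a
# stationary object, f_d = 2 * v * f_c / c. The 77 GHz carrier and
# 40 mph speed are assumed values for illustration.

C = 299_792_458.0  # speed of light, m/s


def doppler_shift_hz(closing_speed_ms: float, carrier_hz: float) -> float:
    """Return the two-way Doppler shift in Hz for a monostatic radar."""
    return 2.0 * closing_speed_ms * carrier_hz / C


v = 40.0 * 0.44704          # 40 mph ~= 17.9 m/s
f_c = 77e9                  # common automotive MMW radar band
f_d = doppler_shift_hz(v, f_c)
print(f"{f_d:.0f} Hz")      # on the order of 9 kHz -- easily measurable
```

A shift of several kilohertz against a near-zero-Doppler background is trivial for radar signal processing to pick out, which is why stationary-obstacle detection via Doppler is plausible at highway speeds.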
...or approach a raised draw bridge.
A few years ago, a woman “driving” her Tesla on autopilot (while reading a book) rear ended a vehicle that was at a complete stop, killing the driver in that vehicle. Her car was going over 80 MPH. This happened about 2 miles from where I am sitting right now.
Or maybe the Tesla is really smart and KNEW it was styrofoam so it ran right through it. He needs to test with a picture of the road on a rock wall and see what happens...
“Who is legally responsible?”
The operator of the motor vehicle is always responsible for the actions of that vehicle and everything inside of it as well. Ask a cop friend about that last point or just watch On Patrol Live. People do years of jail time because one of their passengers brought drugs into the vehicle.
It’s the same for aircraft: the PIC (pilot in command) is always responsible for all actions of the aircraft, regardless of whether the autopilot is engaged.
Street Artist Painted a Road Runner Tunnel On A Wall, Someone Tried to Drive Through It
The Drive has been anti-Tesla since Musk bought Twitter.
I understand there’s a difference between Tesla Autopilot and Tesla Full Self-Driving - would FSD have been similarly challenged?
I am leery of this technology. I am concerned that some people will not monitor. A human pilot who is using auto pilot still monitors the flight.
At night, in the rain, with all the reflections off of the wet road, it's hard to tell what the AI would make of it.
My 2013 lane-keeping system doesn't do well in Texas with their white cement roads and white painted lane stripes. I can't imagine what an AI would do with traffic lights, street lamps, and advertising signs all reflecting off of the wet road.
-PJ
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.