Posted on 02/19/2020 12:51:54 PM PST by BenLurkin
A group of hackers has managed to trick Tesla's first-generation Autopilot into accelerating from 35 mph to 85 mph with a modified speed limit sign that humans would still read correctly.
Hackers at McAfee Advanced Threat Research conducted the experiment.
Ultimately, they were able to make a Tesla vehicle on Autopilot accelerate to 50 mph over the posted limit:
The ultimate finding here is that we were able to achieve the original goal. By making a tiny sticker-based modification to our speed limit sign, we were able to cause a targeted misclassification of the Mobileye camera on a Tesla and use it to cause the vehicle to autonomously speed up to 85 mph when reading a 35-mph sign. For safety reasons, the video demonstration shows the speed start to spike and TACC [Traffic-Aware Cruise Control] accelerate on its way to 85, but given our test conditions, we apply the brakes well before it reaches target speed. It is worth noting that this is seemingly only possible on the first implementation of TACC, when the driver double-taps the lever, engaging TACC. If the misclassification is successful, Autopilot engages 100% of the time. This quick demo video shows all these concepts coming together.
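For anyone curious how a small sticker can flip a classifier's output, here is a minimal sketch of a targeted adversarial-patch attack in PyTorch. Everything here (the tiny stand-in model, the class indices, the patch size and placement) is a hypothetical illustration of the general technique, not McAfee's actual method or the Mobileye pipeline:

```python
# Sketch of a targeted adversarial-patch attack: optimize a small
# "sticker" so a classifier reads a 35-mph sign as 85 mph.
# Hypothetical stand-in model and class indices throughout.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Stand-in classifier: any model mapping a 3x64x64 image to 10 class logits.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, 5), torch.nn.ReLU(),
    torch.nn.AdaptiveAvgPool2d(1), torch.nn.Flatten(),
    torch.nn.Linear(16, 10),
)
model.eval()
for p in model.parameters():
    p.requires_grad_(False)  # only the patch is optimized

sign = torch.rand(1, 3, 64, 64)  # stand-in photo of a 35-mph sign
target_class = 8                 # hypothetical index for "85 mph"

# Small patch plus a fixed placement region (the "tiny sticker").
patch = torch.zeros(1, 3, 12, 12, requires_grad=True)
opt = torch.optim.Adam([patch], lr=0.05)

for _ in range(200):
    stickered = sign.clone()
    # Paste the sticker; sigmoid keeps pixel values in [0, 1].
    stickered[:, :, 10:22, 30:42] = torch.sigmoid(patch)
    # Push the prediction toward the attacker's target class.
    loss = F.cross_entropy(model(stickered), torch.tensor([target_class]))
    opt.zero_grad()
    loss.backward()
    opt.step()

print("predicted class:", model(stickered).argmax(1).item())
```

The key point the McAfee result illustrates is that the sticker only needs to move the classifier's decision, not fool a human; the optimized pattern can look like a smudge or tape while still being a worst-case input for the model.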
(Excerpt) Read more at electrek.co ...
Ping.
Oops
I've been surprised that the nav screen on a 2019 F-150 knows where a speed limit change occurs to within a few yards/meters, even when it gets the limit wrong because it was recently lowered by 5 mph; it's apparently reading a map database, not the value on the sign.
I have thought about all the mischief that could be played with all of these new-fangled autopilot driving systems:
...fake crosswalks...
...cardboard cut-outs of pedestrians...
...fake stop signs...
...ropes pulling things across the roadway...
The teenagers of tomorrow will have a lot of fun!
Honest officer, my computer misread the signs!...................
And the same amount of effort trying to fool human drivers would be far more successful.
You people really need to be brighter.
Not as bad as this:
Boeing 737 cockpit screens go blank if pilots land on specific runways:
http://www.freerepublic.com/focus/f-news/3817232/posts
Good bet Waymo's software wouldn't fall for this, since it relies on Google's mapping. I suspect any sign reading faster than the limit in their database would be rejected.
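A minimal sketch of the cross-check that poster is suggesting (the function names and tolerance value are illustrative assumptions, not Waymo's actual API):

```python
# Only accept a camera-read speed limit if it is plausible against a
# map database; otherwise fall back to the map value.
SLACK_MPH = 10  # tolerance for a recently raised limit the map missed

def accept_sign_reading(camera_mph: int, map_mph: int) -> int:
    """Return the speed limit to use, rejecting implausible sign reads."""
    if camera_mph > map_mph + SLACK_MPH:
        return map_mph   # sign reads far above the map value: distrust it
    return camera_mph    # otherwise trust the fresher camera reading

# Example: a tampered 35-mph sign misread as 85 gets rejected.
print(accept_sign_reading(camera_mph=85, map_mph=35))  # -> 35
```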
Wide lanes would be even more luxurious
Tesla = expensive, dangerous junk.
Wait 'til it sees one of these ........
... sounds like old-time Halloween stuff
LOL
Just don’t let that junk fall out of the trunk!
Sign, sign, everywhere a sign
Blockin’ out the scenery, breakin’ my mind
Do this, don’t do that, can’t you read the sign?
FTA: "A group of hackers has managed to trick Teslas first-generation Autopilot..."
Are they still shipping the "first generation," or is there now a second or third generation of the software? If so, I'm assuming they couldn't get the hack to work on the newer stuff (or they didn't test it for some odd reason). So, as usual: keep your software updated and you'll be much safer from vulnerabilities.
Regardless, it gives the fuddy-duddies and troglodytes plenty to point at and bitch about.
Good stuff on the radio back then