Posted on 02/19/2020 12:51:54 PM PST by BenLurkin
A group of hackers has managed to trick Tesla's first-generation Autopilot into accelerating from 35 to 85 mph with a modified speed limit sign that humans would still be able to read correctly.
Hackers at McAfee Advanced Threat Research conducted the experiment.
Ultimately, they were able to make a Tesla vehicle on Autopilot accelerate to 50 mph over the posted limit:
The ultimate finding here is that we were able to achieve the original goal. By making a tiny sticker-based modification to our speed limit sign, we were able to cause a targeted misclassification of the MobilEye camera on a Tesla and use it to cause the vehicle to autonomously speed up to 85 mph when reading a 35-mph sign. For safety reasons, the video demonstration shows the speed start to spike and TACC accelerate on its way to 85, but given our test conditions, we apply the brakes well before it reaches target speed. It is worth noting that this is seemingly only possible on the first implementation of TACC when the driver double taps the lever, engaging TACC. If the misclassification is successful, the autopilot engages 100% of the time. This quick demo video shows all these concepts coming together.
(Excerpt) Read more at electrek.co ...
Once... Twice... Three times a ..... ::::Fireball and screams::::
The best thing about the Tesla’s autopilot is that you can cut them off with impunity. They will brake and let you in, and then follow at a safe distance.
How many owners will continue to use autopilot when they are constantly cut off?
Such a great scene - Newman was a great character
I found and disabled that feature on my (non-self-driving) Toyota after it tried to kill me by braking because the car it thought was in front of me was actually in the exit lane, while the tailgater behind me didn't care how close he got.
"I've been surprised that the nav screen on a 2019 F-150 knows where the speed limit change occurs within a few yards/meters [even if it gets the speed limit wrong because it was recently chopped by 5 mph - it's apparently not reading the value on the sign]."
************************************************************
In brief: GIS (Geographic Information System) data is the mapped spatial data residing in your navigation software. Upon receiving positional (GPS/GNSS) information, the software "understands" precisely where you are by comparing your current location against the speed-zone boundaries stored in the map (humans enter the most recently available speed-limit boundaries when building or updating the GIS data).
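The lookup described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual map code; the zone boundaries are invented route mileposts standing in for the human-entered GIS speed-zone data:

```python
# Toy sketch of a nav unit mapping a GPS fix (reduced here to a route
# milepost) to the stored speed-limit zone. Zones are (start, end, limit)
# tuples entered by the map maintainers; values below are made up.

def speed_limit_at(milepost, zones):
    """Return the posted limit for the zone containing `milepost`,
    or None if the position falls outside every mapped zone."""
    for start, end, limit in zones:
        if start <= milepost < end:
            return limit
    return None  # off the mapped route

zones = [(0.0, 4.2, 55), (4.2, 6.8, 35), (6.8, 12.0, 55)]
print(speed_limit_at(5.1, zones))  # -> 35, from the map, not the sign
```

Note that the answer comes entirely from the stored map, which is why the F-150 in the post above can be a few yards accurate on *where* the limit changes yet wrong about the *value* until the next map update.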
Speed limit information is built into the GPS maps. My Garmins have had this same feature since my first one around 2008.
The GPS uses this speed limit information to help calculate estimated arrival times based on the route chosen.
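A minimal sketch of that ETA calculation (assumed for illustration, not Garmin's actual algorithm): each route segment contributes distance divided by its mapped speed limit.

```python
# Hypothetical ETA estimate from per-segment speed limits.
# Segment lengths and limits below are invented example values.

def eta_hours(segments):
    """segments: list of (length_miles, limit_mph) along the chosen route.
    Returns the naive travel time in hours, assuming travel at the limit."""
    return sum(length / limit for length, limit in segments)

route = [(10.0, 55), (2.0, 35), (30.0, 70)]
print(round(eta_hours(route) * 60, 1))  # estimated minutes for the route
```

A real unit would also fold in traffic, turns, and historical speeds, but the map's speed-limit field is the baseline input.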
I've noticed that when a speed limit changes along a stretch of road, the Garmin or built-in GPS will show the incorrect speed limit. This is usually corrected with the next map update.
I have to laugh, but the Golf Kart crowd will violently object and spin.
“the same amount of effort trying to fool human drivers would be far more successful.”
This.
Put up an “85” sign and darned near every driver will go way over the correct limit. Heck, 85 is normal around here on 55 MPH freeways.
Yes, the machines can be tricked. The difference: per this announcement, Tesla likely will have an update implemented and distributed to all cars by month’s end or so. The drivers around Atlanta will still be ignoring speed limits.
Humans have something called common sense, which overrides some perceptions. Common sense is right enough times that it is essential to the human organism. Yes, it’s wrong sometimes. There is no way to program common sense into a computer. None.
Plus, the human mind has the ability, which it doesn’t always use, to monitor its functioning in real time, and to take measures to get itself back on track. I’d like to see someone try to program that into a computer.
From the article:
“McAfee confirmed that it disclosed its findings to both Tesla and MobilEye before making them public: ‘McAfee disclosed the findings to Tesla ... and MobilEye ... . MobilEye did indicate that the more recent version(s) of the camera system address these use cases.’ In previous instances of vulnerabilities being exposed by white-hat hackers, Tesla has been fairly quick to fix them.”
That’s completely unsafe. That even exceeds rookie cop speed. Double the speed limit plus 10 mph.
I'm wondering whether they'll update the first-generation systems, which use MobilEye. Tesla moved to another system, and I'd highly doubt they'd create fixes for all the variants. I'm not sure the MobilEye unit can even be updated by them.
Flip side: computers are relentlessly focused, while humans get distracted/bored/sleepy. On a long-haul trip I’d rather let the computer handle normal-condition driving.
Statistics on both are being gathered and compared. It comes down to this: if the unrestricted per-person-per-mile death rate of human drivers is more than 10x that of the self-driving computer, the latter should usually be the one operating ... and apparently we're pretty much at that point already. Yes, there will be special-case differences, and it behooves humans to learn which situations those tend to be (bad weather, city driving, obvious oddities) and take over accordingly, making the whole system much safer.
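The threshold rule in that post reduces to a one-line comparison. The rates below are made-up numbers purely to illustrate the arithmetic, not real crash statistics:

```python
# Toy version of the poster's ">10x" rule for deciding who should drive.
# Both rates are invented example values (deaths per mile driven).

def should_computer_drive(human_rate, computer_rate, ratio_threshold=10.0):
    """True when the human death rate exceeds `ratio_threshold` times
    the computer's death rate."""
    return human_rate > ratio_threshold * computer_rate

print(should_computer_drive(1.2e-8, 1.0e-9))  # 12x worse -> True
print(should_computer_drive(5.0e-9, 1.0e-9))  # only 5x worse -> False
```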
Ironically, the band “Tesla” did a cover of that song.
“Tesla= expensive, dangerous junk.”
You have never driven one!
"The best thing about the Tesla's autopilot is that you can cut them off with impunity. They will brake and let you in, and then follow at a safe distance.
How many owners will continue to use autopilot when they are constantly cut off?”
LOL! Don’t brake; RAM THE MF’s!
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.