
Researchers Trick Tesla Autopilot Into Steering Into Oncoming Traffic

Tuesday, April 2, 2019, 03:00 PM, from Slashdot
An anonymous reader quotes a report from Ars Technica: Researchers have devised a simple attack that might cause a Tesla to automatically steer into oncoming traffic under certain conditions. The proof-of-concept exploit works not by hacking into the car's onboard computing system but by using small, inconspicuous stickers that trick the Enhanced Autopilot of a Model S 75 into detecting and then following a change in the current lane. Researchers from Tencent's Keen Security Lab recently reverse-engineered several of Tesla's automated processes to see how they reacted when environmental variables changed. One of the most striking discoveries was a way to cause Autopilot to steer into oncoming traffic. The attack worked by carefully affixing three stickers to the road. The stickers were nearly invisible to drivers, but the machine-learning algorithms used by Autopilot detected them as a line indicating that the lane was shifting to the left. As a result, Autopilot steered in that direction.
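To make the mechanism concrete, here is a toy sketch of why a few small patches can read as a lane line: classical marking detectors collect bright road pixels and fit a line through them, so a handful of roughly collinear stickers can yield a confident "lane" angling across the road. This is a generic Python/OpenCV illustration under that assumption, not Tesla's actual pipeline, and every value in it is made up for the demo.

import cv2
import numpy as np

# Synthetic dark "road" with three small bright patches laid out on a
# diagonal, standing in for the researchers' inconspicuous stickers.
road = np.zeros((200, 400), dtype=np.uint8)
for i, x in enumerate((120, 200, 280)):
    cv2.circle(road, (x, 100 - 20 * i), 3, 255, -1)

# A naive marking detector: gather every bright pixel and fit one line
# through them. The three isolated patches come back as a single,
# continuous "lane marking" angling across the road.
ys, xs = np.nonzero(road)
pts = np.column_stack((xs, ys)).astype(np.float32)
vx, vy, x0, y0 = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
print(f"fitted marking direction: ({vx:.2f}, {vy:.2f}) through ({x0:.0f}, {y0:.0f})")

A real lane-keeping stack is far more elaborate than this, which is exactly why the finding was notable: the consistency checks the researchers describe below still accepted the fabricated marking.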

The researchers noted that Autopilot uses a variety of measures to prevent incorrect detections, including the position of road shoulders, lane histories, and the size and distance of various objects. [A section of the researchers' 37-page report] showed how the researchers could tamper with a Tesla's autowiper system to activate the wipers when rain wasn't falling. Unlike traditional autowiper systems -- which use optical sensors to detect moisture -- Tesla's system feeds data from a suite of cameras into an artificial intelligence network to determine when the wipers should be turned on. The researchers found that -- much as small changes to an image can throw off AI-based image recognition (for instance, changes that cause a system to mistake a panda for a gibbon) -- it wasn't hard to trick Tesla's autowiper feature into thinking rain was falling even when it was not. So far, the researchers have only been able to fool the autowiper system by feeding images directly into it. Eventually, they said, it may be possible for attackers to display an 'adversarial image' on road signs or other cars that does the same thing.

In a statement, Tesla officials said that the vulnerabilities addressed in the report were fixed via a security update in 2017, 'followed by another comprehensive security update in 2018, both of which we released before this group reported this research to us.' They added: 'The rest of the findings are all based on scenarios in which the physical environment around the vehicle is artificially altered to make the automatic windshield wipers or Autopilot system behave differently, which is not a realistic concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should always be prepared to do so and can manually operate the windshield wiper settings at all times.'
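The panda-to-gibbon confusion mentioned above is the canonical adversarial-example result, usually demonstrated with the fast gradient sign method (FGSM). As a hedged illustration of that general class of attack -- a generic PyTorch sketch with a stand-in classifier, not the researchers' code or Tesla's network:

import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, label, epsilon=0.01):
    """Return a copy of `image` nudged by +/- epsilon per pixel in the
    direction that increases the classifier's loss -- a change that is
    nearly invisible yet often flips the predicted class."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()

# Toy demo with an illustrative stand-in model (not a real perception net):
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
image = torch.rand(1, 3, 32, 32)
label = torch.tensor([3])
adv = fgsm_perturb(model, image, label)
print((adv - image).abs().max())  # perturbation never exceeds epsilon

The autowiper attack the researchers describe is the same idea pointed at a different output: craft an input that the camera network maps to "rain."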

Read more of this story at Slashdot.