Hackers Trick Tesla Model S Into Turning Towards Oncoming Traffic Using Stickers On The Road
Tesla has always been very confident about its 'Autopilot' autonomous system, and with millions of miles clocked by users in self-driving mode, that confidence has largely held up on the roads as well. A team of Chinese hackers begs to differ, though, claiming to have successfully tricked the car into driving into the wrong lane.
Keen Labs, a renowned cyber-security research group, recently put the Tesla Autopilot to the test. The team of hackers found a flaw in Autopilot's lane-recognition technology: by strategically placing a few stickers on the road, they made the car steer into the wrong lane.
For this, they attacked the system in two different ways. First, the team tried to blur the lane-dividing line by placing a large number of patches on the road. Though this did fool the system, the team concluded that the procedure was far too impractical to occur in a real-life scenario.
So they created a "false lane" using just three stickers, placed strategically to make it appear as if the lane was merging into another one. The trick worked: Autopilot steered the car away from its present lane and merged onto a lane with oncoming traffic, i.e. the wrong lane.
Tesla misidentifying lane markings (Image: Keen Labs)
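To get an intuition for why a handful of stickers can have such an outsized effect, consider a toy model of vision-based lane keeping: the controller fits a line through whatever marking points the camera detects and steers toward it, so a few well-placed fake detections can drag the whole fit sideways. The sketch below is purely illustrative; the least-squares fit, the coordinates and the lookahead distance are assumptions, not Tesla's actual pipeline.

```python
import numpy as np

# Toy model: a lane keeper fits a straight line to detected lane-marking
# points (x = lateral offset in metres, y = distance ahead in metres) and
# steers toward where that line points. Purely illustrative.

def fit_lane(points):
    """Least-squares fit x = a*y + b through detected marking points."""
    ys = np.array([p[1] for p in points], dtype=float)
    xs = np.array([p[0] for p in points], dtype=float)
    a, b = np.polyfit(ys, xs, 1)
    return a, b  # slope (lateral drift per metre ahead) and offset

def steering_target(points, lookahead=20.0):
    """Lateral position the controller aims for `lookahead` metres ahead."""
    a, b = fit_lane(points)
    return a * lookahead + b

# Genuine markings: the lane edge runs straight ahead at x ~= 0.
genuine = [(0.0, 5.0), (0.0, 10.0), (0.0, 15.0), (0.0, 20.0)]

# Three small "sticker" detections angled toward the oncoming lane
# (negative x). Coordinates are invented for the demonstration.
stickers = [(-0.3, 8.0), (-0.8, 12.0), (-1.4, 16.0)]

print(steering_target(genuine))             # ~0.0 -> stays in lane
print(steering_target(genuine + stickers))  # < 0  -> drifts toward oncoming lane
```

With only three extra detections, the fitted line tilts toward the oncoming lane, which is exactly the failure mode the researchers demonstrated.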
"Our experiments proved that this architecture has security risks and reverse-lane recognition is one of the necessary functions for autonomous driving in non-closed roads," Keen Labs wrote in a paper. "In the scene we built, if the vehicle knows that the fake lane is pointing to the reverse lane, it should ignore this fake lane and then it could avoid a traffic accident."
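The mitigation Keen Labs hints at, ignoring a detected lane that points into the reverse lane, amounts to a plausibility check before the lane estimate is handed to the steering controller. A minimal sketch of such a check follows; the road geometry, the centre-line position and the function names are invented for illustration.

```python
# Sketch of the reverse-lane sanity check Keen Labs describes: if a
# candidate lane would carry the car across the centre line into
# oncoming traffic, discard it. Geometry and threshold are assumptions.

CENTRE_LINE_X = -1.8  # metres; anything left of this is the oncoming lane

def crosses_into_reverse_lane(lane_points):
    """True if any point of the candidate lane lies in the oncoming lane."""
    return any(x < CENTRE_LINE_X for x, _ in lane_points)

def accept_lane(lane_points):
    if crosses_into_reverse_lane(lane_points):
        return None  # ignore the fake lane, keep the previous estimate
    return lane_points

fake_lane = [(-0.5, 8.0), (-1.5, 12.0), (-2.5, 16.0)]
print(accept_lane(fake_lane))  # None: rejected before it can steer the car
```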
That, however, is not the only learning from the cyber-security tests conducted on the car. The crew from Keen Labs was also able to hack into the car's connectivity system and control its steering remotely. A video posted by the group shows two of the crew members sitting in the car, steering it with a gamepad.
The potential harm of such an attack is limited, though: when the car is driven manually, whether in reverse or forward, the remote takeover does not work at any speed above 8 km per hour. It does, however, work "without limitations" when the car is in cruise control.
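In other words, the attack's reach was gated by the car's driving state. A toy version of that gating logic, reconstructed only from the behaviour described above (the mode names, the function and the 8 km/h cutoff are illustrative assumptions), might look like this:

```python
# Toy model of the gating the researchers reported: remote steering
# commands go through "without limitations" under cruise control, but in
# manual mode (forward or reverse) only below ~8 km/h. States and
# function are illustrative, based solely on the behaviour described.

def remote_steering_allowed(mode: str, speed_kmh: float) -> bool:
    if mode == "cruise_control":
        return True                 # reported to work without limitations
    if mode in ("manual_forward", "manual_reverse"):
        return speed_kmh <= 8.0     # reported cutoff for manual driving
    return False

print(remote_steering_allowed("cruise_control", 100.0))  # True
print(remote_steering_allowed("manual_forward", 30.0))   # False
print(remote_steering_allowed("manual_reverse", 5.0))    # True
```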
The team even managed to trick the Model S into believing that it was raining, through a specially crafted image. It admitted, though, that performing such a hack on the windscreen to switch on the wipers is very difficult in a real-world scenario.
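The general recipe behind such a trick, though not necessarily Keen Labs' exact method, is the adversarial example: nudge each pixel of an image in the direction that raises the classifier's "rain" score until the wipers switch on. The toy gradient-sign sketch below illustrates the idea against a stand-in linear classifier; every weight, size and step value in it is an assumption.

```python
import numpy as np

# Toy illustration of crafting an adversarial "rain" image. A linear
# logistic model stands in for the wiper's rain classifier; the weights,
# sizes and the gradient-sign step are illustrative assumptions.

rng = np.random.default_rng(0)
w = rng.normal(size=64)   # stand-in classifier weights (8x8 image)
b = -4.0                  # biased toward "no rain"

def rain_score(img):
    """Probability-like score that the camera sees rain."""
    return 1.0 / (1.0 + np.exp(-((img.ravel() - 0.5) @ w + b)))

clean = np.clip(rng.normal(0.5, 0.1, size=(8, 8)), 0.0, 1.0)

# The score's gradient w.r.t. the pixels is proportional to w, so a
# fast-gradient-sign step nudges every pixel a little along sign(w).
eps = 0.15
adversarial = np.clip(clean + eps * np.sign(w).reshape(8, 8), 0.0, 1.0)

print(f"clean rain score:       {rain_score(clean):.3f}")        # near 0
print(f"adversarial rain score: {rain_score(adversarial):.3f}")  # pushed toward 1
```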
This is not the first time that vulnerabilities have been discovered in a Tesla model. Tesla itself holds periodic hackathons for ethical hackers to expose such vulnerabilities in its systems. In March this year, the company gave away a Model S, along with a $35,000 prize, to a two-man team that successfully managed to bypass the on-board browser in the car. Keen Labs, too, has revealed such anomalies in Tesla models before, having remotely taken control of the brakes of one such model back in 2016.
Tesla has already responded to this particular series of tests. In a statement given to Forbes, a Tesla spokesperson said that the firm had addressed the vulnerability around remote control of the steering wheel before the Keen researchers even got in touch. As for the other issues, the spokesperson said: "The rest of the findings are all based on scenarios in which the physical environment around the vehicle is artificially altered to make the automatic windshield wipers or Autopilot system behave differently, which is not a realistic concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should always be prepared to do so, and can manually operate the windshield wiper settings at all times."