Researchers Tricked Tesla’s Autopilot Systems Into Missing Potential Road Dangers
If you have a $90,000 signal generator lying around, you could give a Tesla owner a really bad day.
Research groups at a handful of institutions around the world have teamed up to expose vulnerabilities in Tesla's Autopilot driver-assist technology. Using everything from basic sound-deadening foam to nearly $100,000 in electronic equipment, researchers showed that the radar and sensors that keep Teslas between the dotted lines can be compromised without warning.
Tests were carried out by researchers at the University of South Carolina, Zhejiang University in China, and a Chinese security group called Qihoo 360, Wired reports. The team used various pieces of equipment to probe several of Tesla's driver-assist functions and the different sensor hardware the cars come with from the factory. The tests were primarily performed on a stationary Model S.
Using an ordinary AM/FM radio and lights, researchers attempted to fake out Tesla's main Autopilot sensors. With these fairly standard items, they were able to convince the car's computer that real obstacles in its path were not there, and to make the sensors register objects ahead that did not exist.
If your Tesla-hacking budget is more ambitious, a $90,000 signal generator paired with another piece of equipment called a VDI frequency multiplier can be used to blind the Autopilot's main front radar, knocking out its self-driving capabilities. This was the most effective method of disabling the car's self-driving functions, and also the most expensive. To pull it off, the researchers aimed the pricey equipment directly at the Tesla's front radar, overwhelming the radar's own signals so that obstacles ahead effectively vanished from the car's view. In the real world, a scenario like this could cause a seriously dangerous accident.
One of the less costly methods of tricking the Tesla's sensors involved wrapping them in sound-deadening foam. With the foam attached, the car's radar had trouble picking up nearby hazards, though overall this was less effective than the nearly $100,000 signal generator.
Researchers even found vulnerabilities in Tesla's Summon feature. When they beamed ultrasonic signals directly at the car's short-range ultrasonic sensors, it could no longer judge the distance to surrounding objects, making the low-speed self-driving feature unsafe.
Following the research, the teams stressed that Tesla should consider modifying its products so that people with bad intentions can't confuse the cars. "If the noise is extremely high, or there's something abnormal, the radar should warn the central data processing system and say 'I'm not sure I'm working properly,'" said Wenyuan Xu, the University of South Carolina professor who spearheaded the tests.
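The kind of self-check Xu describes amounts to simple anomaly detection on the sensor's noise floor. A minimal sketch of that idea is below; everything here (class names, thresholds, sample values) is an illustrative assumption, not Tesla's actual firmware or any real automotive API.

```python
# Hypothetical sketch of a radar front end that flags itself as
# "unhealthy" to a central processor when its noise floor looks
# abnormal (e.g., during jamming). All names and numbers are
# illustrative assumptions, not real Tesla software.

from dataclasses import dataclass, field
from statistics import mean, stdev


@dataclass
class RadarHealthMonitor:
    # Noise-floor samples collected during known-good operation.
    baseline_noise: list = field(default_factory=list)
    # How many standard deviations from baseline counts as "abnormal".
    sigma_limit: float = 4.0

    def is_healthy(self, current_noise: float) -> bool:
        """Return False when a reading sits far outside the calibrated noise floor."""
        mu = mean(self.baseline_noise)
        sd = stdev(self.baseline_noise)
        return abs(current_noise - mu) <= self.sigma_limit * sd


monitor = RadarHealthMonitor(baseline_noise=[1.0, 1.1, 0.9, 1.05, 0.95])
print(monitor.is_healthy(1.02))   # typical reading -> True
print(monitor.is_healthy(25.0))   # jamming-level noise -> False
```

In a real system the monitor would run continuously and push a degraded-sensor status upstream, so the driver-assist logic could hand control back to the driver instead of trusting a blinded radar.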
Check out the whole report on Wired.