Researchers Fool Self-Driving Cars With Stickers on Street Signs

Autonomous cars may not be that smart after all.

By Stephen Edelstein

University of Washington researchers have found a very simple way to trick self-driving cars into misidentifying road signs.

The researchers found that strategically placed stickers are enough to fool the image-processing software in autonomous cars, according to Car and Driver. Spookily, in one experiment, stickers attached to a stop sign caused the vision software to misidentify it as a speed-limit sign.

Security researchers exploited this flaw with what amounts to an analog hack. Instead of breaking into a car's computers and sending illicit commands, they examined the software cars use to identify different objects. Armed with a photo of the target sign and knowledge of faults in how the system processes images, they were able to generate printable stickers designed to exploit those weaknesses.
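
The general idea resembles what security researchers call an adversarial-patch attack. Below is a minimal sketch of that concept, assuming a gradient-based method; the toy classifier, random "sign" image, sticker region, and step size are all illustrative stand-ins, not the Washington team's actual setup.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-in for a real traffic-sign classifier: class 0 = stop, class 1 = speed limit.
classifier = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 2),
)
for p in classifier.parameters():
    p.requires_grad_(False)          # the attacker only modifies the sticker, not the model

sign = torch.rand(1, 3, 32, 32)      # toy stand-in for a photo of a stop sign
mask = torch.zeros_like(sign)
mask[:, :, 10:22, 10:22] = 1.0       # only this patch (the "sticker") may change

patch = torch.zeros_like(sign, requires_grad=True)
target = torch.tensor([1])           # the wrong label the attacker wants to force
loss_fn = nn.CrossEntropyLoss()

for _ in range(100):
    out = classifier(sign + mask * patch)
    loss = loss_fn(out, target)      # push the prediction toward "speed limit"
    loss.backward()
    with torch.no_grad():
        patch -= 0.05 * patch.grad.sign()   # gradient step on the sticker pixels only
        patch.clamp_(-1, 1)
        patch.grad.zero_()

# Ideally prints tensor([1]): the patched "stop sign" now reads as a speed-limit sign.
print(classifier(sign + mask * patch).argmax(dim=1))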

Most autonomous-driving systems compare what the car is "seeing" through its cameras to stored images, so changing the appearance of an object can cause the system to make a mistake. Researchers have been aware of this problem for some time. In India, engineers have had trouble getting self-driving cars to identify the ubiquitous three-wheeled auto-rickshaws, as drivers alter the vehicles in so many ways that they become unidentifiable to sensors.
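
As a rough illustration of that "compare to stored images" idea, here is a hypothetical nearest-neighbour matcher; the reference library and the random "camera crop" below are placeholders, not any production system's approach.

import numpy as np

# Classify a camera crop by finding the closest match in a small library of
# reference sign images (a simple stand-in for a real vision model).
reference_signs = {
    "stop":        np.random.rand(32, 32, 3),   # placeholder reference images
    "speed_limit": np.random.rand(32, 32, 3),
}

def classify(crop: np.ndarray) -> str:
    # Smaller pixel-wise distance means a better match.
    return min(reference_signs,
               key=lambda name: np.mean((crop - reference_signs[name]) ** 2))

camera_crop = np.random.rand(32, 32, 3)
print(classify(camera_crop))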

But the trick described by the University of Washington researchers is particularly nefarious, because it could be used to cause havoc in the real world. The stickers are effective enough to work in most conditions, yet subtle enough that they may not immediately be noticed by human drivers or the police. The stickers could even mess with the speed-limit recognition systems already available on some production cars.

The researchers say the trick isn't foolproof, though. Automakers can use contextual information from maps and the surrounding environment to help cars make the correct identification; a car should be able to figure out that a 65-mph speed-limit sign doesn't belong on a city street, they say. Redundant sensors could also act as a fail-safe.
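
A hypothetical version of that contextual check might look like the sketch below; the road types, plausible limits, and function names are invented for illustration and not drawn from any real autonomous-driving stack.

# Cross-check a recognised speed-limit sign against map context before trusting it.
PLAUSIBLE_LIMITS_MPH = {
    "city_street": {15, 20, 25, 30, 35},
    "highway":     {45, 55, 60, 65, 70},
}

def accept_speed_limit(detected_mph: int, road_type_from_map: str) -> bool:
    """Reject a detected speed-limit sign that doesn't fit the road type."""
    return detected_mph in PLAUSIBLE_LIMITS_MPH.get(road_type_from_map, set())

print(accept_speed_limit(65, "city_street"))  # False: 65 mph is implausible on a city street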

Ultimately, self-driving cars may need to develop the very human ability to recognize an object despite variations in its appearance. A startup called Cortica is developing so-called "unsupervised machine learning" for self-driving cars, which could give them that ability, something we all take for granted.
