Tesla Autopilot Is a Danger to Cyclists, Roboticist Claims

A Stanford University researcher estimates the system only recognized people on bicycles 1 percent of the time.

By Will Sabel Courtney

Tesla's Autopilot system is one of the best semi-autonomous driving suites in the automotive world, as The Drive's own Alex Roy has repeatedly stated—but it's not immune to criticism. After spending some time behind the wheel of one of Elon Musk's electric vehicles, Stanford University researcher Heather Knight has issued a rather strong critique: Tesla's Autopilot, she says, exhibits some "frightening" behavior around cyclists.

"I’d estimate that Autopilot classified ~30 percent of other cars, and 1 percent of bicyclists," Knight wrote in her piece entitled "Tesla Autopilot Review: Bikers will die" on the blog platform Medium. "Not being able to classify objects doesn’t mean the Tesla doesn’t see that something is there, but given the lives at stake, we recommend that people NEVER USE TESLA AUTOPILOT AROUND BICYCLISTS!" [Exclamation point and caps hers. —Ed.]

She noted that the Tesla she and her colleague tested was equipped with the earlier version of Autopilot, known as Hardware 1. The car, she wrote, had been rented, and she did not receive any training in how to use its features. (New Tesla buyers are supposed to receive a walkthrough of the car's features from the dealer before taking the car home.)

Knight—who recently received her Ph.D., according to her Medium account—said she drove the Tesla as part of her research into autonomous driving and social robots, under the guidance of her boss, Dr. Wendy Ju, executive director of interaction design research at Stanford's Center for Design Research. Ju specializes in studying human-robot interaction and autonomous car interfaces, according to the university's Center for Automotive Research at Stanford.

In her piece, Knight stressed that her concerns about cyclists largely applied to the use of Autopilot on its own, not in conjunction with human judgment as the system is intended to be used. (But as we've seen time and again since Tesla introduced the technology, drivers can't always be trusted to use Autopilot responsibly.)

"I’m concerned that some will ignore its limitations and put biker lives at risk," she wrote. "Treating Autopilot as fully autonomous system might be reckless for a person in a car but fatal to a bicyclist, who has a lot less protection. Encouraging a balanced mental model of the machine is exactly the goal of this article."

"But as a human-in-the-loop system," she wrote, "this car’s features would impress Iron Man."

She also rated several of the Tesla's other features, both those associated with Autopilot and some that aren't. She spoke highly of the situational awareness provided by the display, which gives the driver an excellent idea of what the car sees—and what it doesn't, such as the estimated 99 percent of bicyclists it reportedly fails to classify.

The Drive has reached out to Tesla for a public statement; we'll update this story should the company provide one. 
