Report From Startup Drive.ai Shows Autonomous Cars Aren't Perfect

Drive.ai cars disengaged from autonomous mode 151 times last year.


Autonomous-driving startup Drive.ai has released its 2017 "autonomous vehicle disengagement report." These reports are mandatory for companies testing self-driving cars in California, and they list key figures such as miles driven autonomously and the number of disengagements that occurred over those miles. According to its report, Drive.ai performed well, but not perfectly.

The company's self-driving cars covered 6,572 miles in autonomous mode last year, according to the report, but they also disengaged from autonomous mode 151 times, requiring a human driver to take over. Drive.ai also claims that its seven autonomous cars weren't involved in any crashes, and that the average distance driven between disengagements increased significantly, from around three miles in 2016 to around 100 miles by November 2017.

Drive.ai received its permit to test self-driving cars on California roads in April 2016. Its autonomous fleet includes an Audi A4, three Lincoln MKZs, and three Nissan NV200s. All testing currently takes place in the San Francisco Bay Area. The startup is expected to partner with Lyft on an autonomous ride-sharing pilot in San Francisco, similar to the program Lyft is already running with NuTonomy in Boston.

Other companies' 2017 reports haven't been made public yet, but in its 2016 report, General Motors' Cruise Automation said its cars covered 9,776 miles, and that human drivers had to take back control 181 times. Waymo's self-driving cars racked up 635,000 miles in California in 2016, disengaging from autonomous mode 124 times.

It's also worth noting that Drive.ai's avoidance of crashes in 2017 may be due as much to luck as to the capability of its cars. Cruise reported multiple crashes last year but claims its cars were not at fault. It's now being sued by a motorcyclist who claims one of Cruise's autonomous cars hit him while making an improper lane change; Cruise counters that the motorcyclist was at fault because he was lane splitting unsafely.

Drive.ai's avoidance of crashes and relatively low rate of disengagements are exactly what supporters of self-driving cars want to see. But even that performance may not be enough to convince a skeptical public: recent polls show that most Americans don't trust self-driving cars. The question is whether companies and lawmakers will wait to gain the public's trust or push ahead with autonomous cars that may be imperfect.