Unpacking California’s Self-Driving Car Reports

Why Google, Nissan, Volkswagen and everyone else still needs a human behind the wheel.

By Michael Frank

Here’s a fun fact: The liberal government of California requires carmakers that test autonomous cars on public roads to report how frequently engineers intervene to prevent an accident. I know, I know, just the form of tyranny Donald Trump loves to hate. You want a more fun fact? Volkswagen called its autonomous vehicles Jack and Igor.

That’s the sort of thing you find inside the California DMV Autonomous Vehicle Disengagement Reports. You also learn that Igor is a much better driver. Between October 2014 and January 2015, that self-driving Volkswagen rolled off 5,531 test miles and had 85 “disengagements due to failure.” In roughly the same time window, Jack went way farther but had 175 “disengagements due to failures.” Mostly, though, these reports are dry as toast. At least Nissan’s includes a little excitement: “The Autonomous Vehicle began to merge into a lane behind another vehicle very closely. For safety, the driver overrode the system….”

Google’s report is very deep because the company has more autonomous cars than all other automakers combined: “We’re self-driving 30,000-40,000 miles or more per month, which is equal to two to four years of typical US adult driving,” its report touts. The tech giant’s filing says its autonomous cars traveled 1.3 million miles through November 30 of last year. Nearly 425,000 of those miles were on public roads, and in that period Google’s human drivers needed to take the helm 69 times to prevent contact with another car, object, or person. That’s not bad, truly. Ever ridden shotgun with a 16-year-old? And they’ve got real brains in their heads. (Allegedly.)
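The raw counts are hard to compare without factoring in mileage, so here is a quick back-of-the-envelope sketch (in Python, my own illustration rather than anything from the DMV filings) that turns the figures quoted above into miles per intervention:

```python
# Rough miles-per-intervention comparison using the figures quoted above.
# The numbers come from this article's reading of the reports; the
# calculation itself is only an illustration, not part of any DMV filing.

vehicles = {
    "VW 'Igor'": {"miles": 5_531, "interventions": 85},
    "Google fleet (public roads)": {"miles": 425_000, "interventions": 69},
}

for name, stats in vehicles.items():
    rate = stats["miles"] / stats["interventions"]
    print(f"{name}: roughly {rate:,.0f} miles per intervention")
```

Run it and Igor works out to roughly 65 miles per intervention, while Google’s public-road fleet goes nearly a hundred times farther between interventions, at least on this crude measure.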

Google’s explanations for why humans had to intervene are actually pretty amusing. They run the gamut from obvious (“In one case the other vehicle was driving the wrong way down the road in the SDC’s path”) to slightly alarming. For instance: “‘Unwanted Maneuver’ of the vehicle involves the SDC moving in a way that is undesirable, e.g., coming uncomfortably close to a parked car.”

Google reports a lot of these Unwanted Maneuvers. But there’s a subjective element here, too. Imagine sitting behind the wheel mile after mile not driving, then trying to judge the moment an autonomous car is about to sideswipe a school bus.

The upshot? For a relatively young strain of technology, self-driving cars are in very few accidents. A piece from The Verge cites the much-discussed University of Michigan study from October, which found that self-driving cars are five times as likely to be involved in an accident as cars driven by humans. But that same study also found that autonomous cars do a much better job of not killing people than human drivers do.

That said, as Audi explained to me at a Consumer Electronics Show roundtable earlier this month, a robot cannot register eye contact at a four-way stop. It only knows the rules and can sense motion; when another driver breaks those rules, or acts irrationally, the most rational response might be to out-crazy the other driver. Try programming that into an algorithm.
