An Autonomous Uber Claimed its First Pedestrian Victim, but Don’t Expect Washington to Care

Listen up, automakers: The street where my child plays isn’t your autonomous technology proving ground.

By Lawrence Ulrich

We all knew it was coming: the first pedestrian in America struck and killed by an autonomous car. She won’t be the last. The death of 49-year-old Elaine Herzberg, hit by a self-driving Uber vehicle in Arizona, is still no reason to overreact and demonize this budding technology. But it’s yet another reason to reject the idea—suddenly, inexplicably fashionable—that the auto industry should be trusted to regulate itself.

That hazardous-to-your-health thinking has already taken root in Arizona, where Herzberg was struck while walking her bicycle across a nighttime street in Tempe. It was Arizona, and its business-hungry, regulation-averse Governor Doug Ducey, who personally invited Uber—which had been barred from testing in San Francisco after flouting even modest oversight—to ditch California for his state's laissez-faire vibe instead. “Arizona welcomes Uber self-driving cars with open arms and wide-open roads,” Ducey proclaimed, three weeks into his new job. Like the Beverly Hillbillies in reverse, Uber loaded up its trucks and moved to Tempe, adding to its existing self-driving program in Pittsburgh.

Ducey has dialed back a bit on the Wild West attitude, especially after two serious crashes of Uber Volvos, both blamed on human drivers of the other vehicles involved. But that’s no consolation for Herzberg’s family, or for the victim herself. Tempe police said that the Volvo XC90, one of up to 24,000 that Uber is retrofitting with self-driving tech, was traveling about 38 mph in autonomous mode when it struck Herzberg. (Being struck at 40 mph results in a pedestrian fatality in more than 80 percent of cases.) Despite its huge array of sensors, including a spinning rooftop Lidar unit, multiple cameras, and radar, the Volvo apparently didn’t spot Herzberg in time; police said the car did not appear to have slowed. The Uber’s human backup pilot also apparently failed to take over the controls, though she was sprightly enough to get the job despite a reported prison stretch for attempted armed robbery. Still, no jumping to conclusions. Based on video from the Volvo, Tempe police suggested that Herzberg may have been at fault, stepping into the path of a fast-moving car so unexpectedly that no reaction, by human or microchip, could have prevented the accident.

Another reflex, one I admit to having myself, would be to point to the 5,997 walkers and bicyclists who were killed in vehicular crashes in America in 2016, among the more than 37,000 total traffic fatalities. Remarkably, 2016’s pedestrian death toll soared by 11 percent—the biggest single-year jump in history. Not one of those pedestrians was killed by a self-driving car. But safety experts note that smartphone distraction is surely responsible for some of the rise in fatalities, whether it’s the driver or the unlucky pedestrian (or both) who’s using the device. And while I’m not suggesting that the Tempe victim was at fault here, a quick plug for responsibility and safety nonetheless: 75 percent of pedestrian fatalities occur at night; 72 percent of victims are struck somewhere other than an intersection; and one in three pedestrians killed is legally intoxicated. That’s right: not drunk drivers, but drunk pedestrians.

Again, it’s possible that Herzberg would have been killed no matter who was piloting the Volvo. But automakers and transportation providers also have responsibilities. If they shirk those responsibilities, or scoff at them outright, it’s up to our elected officials to remind them who’s in charge. 

For one, the industry contention that self-driving cars are already “safer” than human drivers—backed by dubious statistics from Elon Musk and others—isn’t remotely proven. We supposedly incompetent Americans managed to drive 100 million miles for every 1.18 fatalities in 2016. Uber’s first fatality, after just 3 million miles of testing, makes humans look brilliant by comparison, giving Uber a fatality rate of 33.3 per 100 million miles. And that’s a meaningless statistic as well; we have no idea whether autonomous cars, in their fledgling state, are safer than, more dangerous than, or roughly equal to human-driven cars. The sample size is too small to compare a relative handful of autonomous rides—even if you throw Teslas on Autopilot into the mix—with hundreds of millions of conventional cars whose drivers log more than one trillion annual miles in America. A Rand Corporation study determined that autonomous vehicles would have to be tested for hundreds of millions of miles, “and sometimes hundreds of billions of miles, to demonstrate their reliability in terms of fatalities and injuries.” (Emphasis mine.) The report also concluded that test-driving results alone “cannot provide sufficient evidence for demonstrating autonomous vehicle safety and reliability.”
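For readers who want to check that math, here is the back-of-the-envelope arithmetic, a minimal sketch using only the figures cited above (1.18 U.S. fatalities per 100 million miles, and one fatality in roughly 3 million Uber test miles); the point is how little a single event over so few miles can tell us:

```python
# Back-of-the-envelope comparison using only the figures cited in the text above.
# Not a statistically meaningful estimate -- the "sample" is one crash.

human_rate_per_100m_miles = 1.18        # U.S. average fatality rate, 2016
uber_test_miles = 3_000_000             # approximate autonomous test miles to date
uber_fatalities = 1                     # the Tempe crash

uber_rate_per_100m_miles = uber_fatalities / uber_test_miles * 100_000_000

print(f"Human drivers: {human_rate_per_100m_miles} fatalities per 100M miles")
print(f"Uber to date:  {uber_rate_per_100m_miles:.1f} fatalities per 100M miles")
# Prints roughly 33.3 -- but one event over 3 million miles proves nothing either way.
```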

We do have evidence that not all autonomous technology is created equal, and that Uber’s may be especially shit. A 2017 study by Edison Investment Research suggested that Uber’s cars were “5,000 times worse” than Google/Alphabet’s Waymo when it came to how often their systems disengage and require a human pilot to take over. Uber’s self-driving technology disengaged once per mile on average, while Waymo’s cars checked out only once every 5,128 miles. Despite this troubling degree of brain-lock from Uber’s cars (let alone brainless company leadership that tends to deny any responsibility aside from making money), many members of Congress and the Trump administration seem eager to let citizens be the easily squashed guinea pigs for the next big experiment: turning autonomous cars loose on public streets without steering wheels or brake pedals, or even a human back-up inside, as General Motors (for one) is lobbying to do with its Cruise Automation cars.
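For what it's worth, the arithmetic behind that “5,000 times worse” figure is just the ratio of the two disengagement rates quoted above; a minimal sketch:

```python
# Miles per disengagement, per the Edison Investment Research comparison cited above.
uber_miles_per_disengagement = 1        # roughly one human takeover per mile
waymo_miles_per_disengagement = 5_128   # one takeover every ~5,128 miles

ratio = waymo_miles_per_disengagement / uber_miles_per_disengagement
print(f"Waymo goes about {ratio:,.0f}x farther between disengagements than Uber")
# Prints ~5,128x -- the source of the rounded "5,000 times worse" claim.
```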

Do we really need to be reminded of what happens when automakers are left to their own devices? A very brief history would include Pintos flambé, GM’s deadly ignition switches, and the Ford/Firestone debacle. Throw in VW’s recent Dieselgate conspiracy to defraud the U.S. government and consumers, and one might think that elected officials and regulators would be working overtime to ensure it doesn’t happen again on their watch. Instead, we're in Bizarro World. Transportation Secretary Elaine Chao keeps yelling that any bid to regulate autonomous cars commits the mortal sin of “picking winners and losers” in technology. Yet the idea that even modest safety oversight of this multi-billion-dollar technology sector would stifle competition and development is laughable. The House has already passed a bill that would exempt autonomous carmakers from many safety standards and prevent states from creating their own restrictions. A Senate version of the bill hasn’t come up for a full floor vote, but they’re working on it, perhaps by letting automakers write it themselves.

Uber has suspended testing in Arizona for the moment, but that's nothing more than damage control. I suspect Uber will reach a settlement with Herzberg's family, regardless of fault, to avoid a public spectacle in court. If this ends up being Herzberg's fault, expect automakers to bring the usual full-court press to soothe the public and rationalize their test regimes.

But screw that. If anything good comes of Herzberg's unfortunate death, it’s that elected officials and regulators might find a conscience, or a spine, and push back against a regulatory blank check for the auto industry. Senator Richard Blumenthal, Democrat of Connecticut, said that this “tragic incident makes clear that autonomous vehicle technology has a long way to go before it is truly safe for the passengers, pedestrians, and drivers who share America’s roads.”

The problem is that automakers don't share that sentiment. Make no mistake: I’m still bullish on autonomous cars' potential to reduce accidents and fatalities, save time and boost productivity, and provide a transportation lifeline for the elderly or disabled. But safety and public goodwill have never been the prime motivation of global automakers. It's about money and market share. As Alex Roy argued in his inspiring Human Driving Manifesto, companies as old as General Motors and as new as Google have something in common: they hope to earn billions in profits via a largely self-interested movement to tear the steering wheel from human hands—or eliminate it entirely.

Hey, automakers: If you want to experiment with cars with no steering wheels or driver aboard, be my guest. But do it at your proving grounds, with nice tall fences and security, until the tech is ready for prime time. The street where my daughter plays isn’t your playground.
