Uber Pedestrian Death Might Force Self-Driving Car Makers to Pump the Brakes
Autonomous vehicle development is happening too fast, some say—but others think accidents are simply inevitable.
Sunday night’s fatal accident in Tempe, Arizona, involving an autonomous Uber and a pedestrian will likely have strong implications for the development of autonomous vehicles, as it’s the first such death in the quest for self-driving cars.
Elaine Herzberg, 49, was struck by Uber's self-driving Volvo XC90 SUV at around 10 p.m., according to police, and died later at the hospital. There was one occupant in the vehicle; Uber has confirmed the car was operating in autonomous mode, with the occupant present as a safety backup. An investigation is expected to determine what, precisely, caused the accident, but the mere fact that it happened is already causing much speculation.
For some, it’s a sign that things are moving too fast. “The state of California has said that next month, they plan to allow autonomous cars on the road with no human supervisor in the car,” said autonomy engineer Missy Cummings, director of the Humans and Autonomy Laboratory at Duke University. “One would hope this fatality causes them to rethink this plan. It is very sad that someone had to die for NHTSA and Congress to pay attention to the fact that these technologies are still very experimental and not ready for widespread deployment.”
Congress has been wrestling recently with the degree to which it should regulate the development of autonomous vehicles. In fact, a report in Automotive News earlier this month highlighted the perils of legislation being pushed through Congress that would loosen restrictions on the development and testing of self-driving cars—noting, in particular, shortcomings in the vehicles’ ability to identify cyclists. The piece also described the automotive industry’s countervailing efforts to speed the legislation along.
Cummings says that the rollout of this new technology needs to be transparent and carefully modulated no matter what timeline it happens on, given the uncertainty about the capabilities of brand-new on-board sensors. “There is no question that the perception systems in these cars are the long pole in the tent,” said Cummings, who coincidentally released a paper on the subject just one day after the accident. “Researchers are still learning about the weaknesses of these systems, and if we are still uncovering new problems in basic research, this is not the time to be releasing this technology on a broad scale.”
She added that the process shouldn’t be halted, but that regulators should start insisting companies share their test data with local municipalities, so better-informed decisions can be made about when and how to expand test markets.
“Companies are moving too fast to implement the technology,” she said.
The accident will also likely put an already-beleaguered company under the microscope. “This will test whether Uber has become a trustworthy company,” said Bryant Walker Smith, an autonomy specialist at the University of South Carolina School of Law. “They need to be scrupulously honest, and welcome outside supervision of this investigation immediately. They shouldn't touch the systems without credible observers.”
Smith also noted, however, that there shouldn’t be a kneejerk reaction to this incident. “On the same day this tragedy happened, 100 other people died in crashes in the United States alone,” he said. “Their deaths are also tragic.”
Indeed, every year, car accidents cause roughly 40,000 fatalities in the United States. That’s roughly the equivalent of a fully loaded Boeing 717 airliner crashing every day. Globally, more than a million people die—equivalent to 35 such airline crashes daily. The urgency to eliminate those deaths needs to be maintained, he suggested.
“Developers and regulators knew that tragedy was—and still is—a possibility. Frankly, I'd be much more concerned if they believed otherwise,” he said. “But while [we] should be concerned about automated driving, we should be terrified about conventional driving.”
Smith emphasized, however, that Sunday night’s accident also shouldn’t be interpreted with too much statistical generosity, either. “On average, there's a fatality about once every 100 million miles [of human driving],” he added. “So while this incident is not statistically determinative, it is uncomfortably soon.”
The accident will also likely prompt all parties to examine more closely the interaction between pedestrians, self-driving vehicles, and the infrastructure that supports them, said Neal Walters, a partner at Philadelphia law firm Ballard Spahr who specializes in legal matters in the automotive sphere. “This is an important data point in understanding how to continue to develop AVs in an intermodal transportation system,” Walters said. “[But] it is premature to predict what, if any, influence this may have. Care should be taken to not jump to any conclusions about the performance of the vehicle, or the behavior of the pedestrian, until a full accident reconstruction is completed.”
To this point, the National Transportation Safety Board tweeted late Monday afternoon that it was sending a team to investigate the collision, which is no surprise given its recent interest in other autonomy-related accidents. The board indicated that it will focus on the vehicle’s interaction with the environment, other vehicles, and vulnerable road users such as pedestrians and bicyclists. It will likely also examine the occupant’s possible role in the accident—something that won’t be difficult, as Tempe police have already indicated they have access to cameras showing both the outside of the car during the incident and the driver on the inside. Many automakers are advocating for driver involvement in semi-autonomous systems as a safety backup during the interim steps on the way to full autonomy; that strategy could now come under equally intense scrutiny if it’s proven that the driver wasn’t as attentive as he or she should have been.