Autonomous Cars Are Getting into Accidents Because They Drive Too Well
Stopping completely at stop signs and obeying the speed limit are getting self-driving cars rear-ended.
Imagine a world where every driver follows the rules of the road perfectly: every car comes to a complete stop at every stop sign, and the speed limit is never exceeded. That's the world parts of California are living in, because some of the drivers are robots.
With self-driving test mules roaming California's roads, it's become clear that some of them are overly cautious. Human drivers are so unaccustomed to cars that come to a complete stop and obey the speed limit that the mismatch is producing a lot of low-speed accidents, according to a Bloomberg report.
"They don’t drive like people. They drive like robots," Mike Ramsey, an analyst at Gartner who specializes in advanced automotive technologies, told Bloomberg. "They’re odd and that’s why they get hit."
California is the only state that requires reports on autonomous vehicle accidents. According to the California DMV's website, 43 autonomous vehicle accidents have been reported so far. A self-driving car was rear-ended in 13 of them, and almost all occurred at intersections, at low speeds, with no injuries.
The companies developing self-driving cars are still figuring out how to make autonomous vehicles integrate better with human-driven ones. GM's Cruise Automation CEO Kyle Vogt said in a blog post that his company's autonomous Chevy Bolts are "designed to emulate human driving behavior but with the human mistakes omitted."
Google's self-driving car arm, Waymo, is doing something similar, tweaking behaviors like taking wider turns and inching forward at flashing yellow lights, both to be safer around other motorists and to make the ride more comfortable for the autonomous car's occupants.
Do robots need to drive more like humans or do humans need to drive more like robots?