Can Tesla Solve its Autopilot Problem?

We reconstruct the fatal Autopilot accident scene to find out.

By Alex Roy

Before Joshua Brown was killed in his Tesla Model S on Autopilot while allegedly watching a movie, I had a conversation with my friend, Comms. “The first person to kill someone in a Tesla on Autopilot,” Comms said, “is going to be responsible for 340,000 deaths.”

Comms is an old friend working in communications for a major automotive manufacturer. He’d just spent an hour failing to convince me Elon Musk was the modern Preston Tucker, but I couldn’t argue with his newest line of reasoning.

“Nonsense,” I said. “It’s great. I know its limitations.”

But he was right. I did almost kill 340,000 people the last time I drove a Tesla on Autopilot. It was amazing how close I came. There they were, lined up on both shoulders of the Interstate like luminous bowling pins waiting to be mowed down. I remembered how well Autopilot worked, and how badly I wanted to close my eyes, or watch a movie, or open my laptop and answer emails.

“It doesn’t matter,” said Comms. “If you don’t kill someone, someone else will.”

“Maybe.”

“Definitely. Every time you try to set a Cannonball record on Autopilot, you run the risk of an accident that will set back the whole industry ten years. It’s dangerous—”

“So if Autonomous Driving is supposed to cut fatalities by 90%, and 38,000 Americans were killed last year, if I become that guy I’m responsible for—”

“Killing 34,000 people. Every year. For ten years. 340,000 deaths.”

“Legally, the driver is still responsible.”

Comms smiled. “Do you really think that matters?”

It was a pretty convincing position. So convincing that I promised myself I wouldn’t become that guy, and that the next time I set Autopilot to 90 mph I would follow the car's warnings to the letter.

"The system is in beta. Be prepared to take over at any time. Pay attention."

That's right: Pay attention. Then tragedy struck, because a Tesla owner named Joshua Brown didn’t.

Whose Fault Was It?

This one’s easy. If you believe in a nanny state whose logical conclusion is Wall-E, the fault lies with Tesla. If you believe in personal responsibility, it was Joshua Brown’s.

We may never know for sure, but you don’t need to be Nostradamus to know pretty much how this went down. You just need to have spent more than the length of a press junket using Autopilot, both to fall in love with it and to see precisely how it was likely to go wrong.

And did.

I hate to say it, but it was probably Brown’s fault.

How can I be so callous? Because I’ve driven thousands of miles in a Tesla on Autopilot; because I was on the team that set the EV and Semi-Autonomous Cannonball Run records in a Model S; because I know exactly what it’s like to have faith in a technology so brilliantly executed; and because I know how to read a police report and use Google Maps.

Let’s Get Started

“Nobody knows anything,” said William Goldman of the film business, but he could have been talking about the media’s coverage of the world’s first Autonomous Driving fatality. I’ve read no more than a handful of intelligent articles on the tragedy. The rest have been the standard bukkake of clickbait reposts. Googling “Tesla accident death” yields 200,000+ stories with “Driverless” or “Self-Driving” in the headline.

Guess what? A Tesla with Autopilot isn’t a Self-Driving Car.

It operates at what’s called Level 2 Autonomy, and Brown—an ex-Navy SEAL, tech executive, and self-proclaimed Tesla evangelist—must have known this better than anyone. According to NHTSA, Tesla, and anyone who has ever enabled Autopilot via the Tesla UI and used it for more than sixty seconds, it may disengage at any time, and the driver must be ready to take over.

Let’s Go To The Map

This was not a complex accident. Brown was headed eastbound on US-27A, traveling at somewhere between 65 and 90 mph—Autopilot’s upper limit. The speed limit is 65 mph. Weather conditions were perfect.

There was no indication the Tesla’s brakes were applied before initial impact with a tractor-trailer truck.

The police report describes the truck executing a left turn to head south on 140th Court when it was struck mid-trailer by Brown’s eastbound Tesla.

Here’s what it looks like in Google Maps, with just over 1200 feet separating the point of impact from the left edge of the map. There is a small rise approximately 600 feet west of the point of impact, just east of 138th Terrace:

Here’s what it would have looked like from Brown’s POV, 1200 feet from the point of impact. The crest is visible halfway to the point of impact:

Here’s what it would have looked like from Brown’s POV, 700 feet from the point of impact, from the top of the crest, just west of 138th Terrace:

Was It Fate?

If there is such a thing as an unavoidable accident, this does not appear to have been one. A white tractor-trailer preparing to make a turn would have been visible even over the crest at a distance of 1200 feet, and all the more so once Brown crested the rise at a distance of 700 feet.

Is it possible the truck turned at the last possible second? Of course, but the point of impact suggests the truck had almost completed its turn. No one has suggested the truck stopped mid-turn, which means it was still in motion. A tractor-trailer is a large object, nearly impossible for the human eye to miss even at a distance of a quarter mile.

Was it the trucker’s fault? Unlikely.

Let’s give Brown the benefit of the doubt and assume he was going 60 mph. What is the stopping distance of a Tesla Model S at 60 mph?

108 feet.

If Brown had been paying attention—whether or not Autopilot had been enabled—he would have had more than enough time to stop the car, had he chosen to.

If one of the witnesses is correct and Brown was traveling at 90 mph, he still would have had nine seconds and 1200 feet during which he could have stopped the car, or at least slowed it enough to make the impact easily survivable.
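If you want to check those numbers, here is a quick back-of-envelope sketch in Python. It assumes the 108-foot figure above for the Model S at 60 mph, scales braking distance with the square of speed, and ignores driver reaction time, so treat the output as rough arithmetic rather than crash-reconstruction data.

```python
# Rough time-and-distance arithmetic for the scenario described above.
# Assumptions: 108 ft is the Model S 60-0 mph stopping distance cited in the text,
# braking distance scales with the square of speed, and reaction time is ignored.

FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600
BASELINE_SPEED_MPH = 60
BASELINE_BRAKING_FT = 108

def feet_per_second(mph: float) -> float:
    """Convert miles per hour to feet per second."""
    return mph * FEET_PER_MILE / SECONDS_PER_HOUR

def time_to_cover(distance_ft: float, mph: float) -> float:
    """Seconds needed to cover a distance at a constant speed."""
    return distance_ft / feet_per_second(mph)

def braking_distance(mph: float) -> float:
    """Scale the 60 mph baseline by the square of the speed ratio."""
    return BASELINE_BRAKING_FT * (mph / BASELINE_SPEED_MPH) ** 2

for mph in (60, 90):
    print(f"{mph} mph: {time_to_cover(1200, mph):.1f} s to cover 1200 ft, "
          f"~{braking_distance(mph):.0f} ft to stop")
# 60 mph: 13.6 s to cover 1200 ft, ~108 ft to stop
# 90 mph: 9.1 s to cover 1200 ft, ~243 ft to stop
```

Even at 90 mph, the arithmetic leaves roughly nine seconds and well over a thousand feet of margin beyond the estimated braking distance.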

But wait, is it possible Brown had a reasonable expectation that his Tesla would “see” the truck and Automatic Emergency Braking would engage?

Reasonable, sure. But Brown—an avid Tesla fanboy—surely knew Autopilot’s limitations, especially that Autopilot likes to brake late, and aggressively. If my Tesla were approaching a white truck crossing the highway perpendicular to my path—and I was doing 90, or even 60—I wouldn’t wait for the Tesla to react.

My foot would naturally move to the brake pedal.

But that could only happen if I was paying attention. The lack of evidence of any braking or steering inputs suggests Brown never saw the truck. He couldn’t have missed it had he been looking up, but Brown was allegedly watching a Harry Potter movie at the time.

Fate.

Was It Autopilot?

Tesla’s critics would have you believe Autopilot is critically flawed, from hardware to software to user interface. That might be true if Tesla claimed to be selling a Driverless/Self-Driving Car, but it makes no such claim.

Brown knew this.

Those critics would also have you believe Autopilot failed to deliver on the promise of its brand name, suggesting it doesn’t actually function as an “autopilot.”

Guess what? Planes with autopilots still have pilots. Boats with autohelms still have captains. Humans remain “in the loop” because things can go wrong even in a final version. In today's world, there is no final version.

Brown knew this.

An ex-Navy SEAL would be more likely to understand the need for a human in the loop, especially in a Beta release. To suggest Brown was a victim of aggressive marketing is to insult a man better equipped to understand such technology than 95% of Tesla owners and 99% of journalists writing about the crash.

Ex-Navy SEALs aren’t known for shirking personal responsibility. Brown would have understood that the driver remains legally responsible at all times, and yet all evidence suggests he became overconfident in the system and paid the final price for it.

What could possibly compel him to do so?

Not a Defect, A Feature

What makes Autopilot so much “better” than rival systems? It isn’t the sensor hardware, which it largely shares with competing models from Mercedes and Volvo. It’s the software. Autopilot’s AI is derived from 140+ million miles of driving data collected from Tesla owners, a dataset vastly larger than even Google’s. Most importantly, Tesla has unshackled Autopilot from the bonds every rival’s legal department has insisted upon:

The hand-on-wheel interval.

Get into a new Mercedes E-class or Volvo S90, engage whatever they call their L2 Autonomy, take your hands off the wheel, and you’ve got 30-60 seconds before—

Ding. Ding. Place your hands on the wheel.

Autonomous Driving this is not, but it’s not meant to be. L2 systems are meant to be semi-autonomous. Mercedes and Volvo will be the first to tell you these are Advanced Driver Assistance Systems (ADAS), and that the interval is there to keep the driver engaged and alert.

Get in a Tesla and engage Autopilot in perfect conditions, say, a straight road with clear markings and good weather?

A lot longer than 60 seconds.

I’ve sat in a Model S that drove itself for several minutes. I’m almost afraid to say how long, because once you’ve mastered matching Autopilot’s speed to conditions, it’ll go a lot longer. Stick a water bottle in your steering wheel and it might go for an hour. A rubber band looped around an iPhone mount might even get you two hours.

Autopilot may be L2, but once you get past two or three minutes, it sure feels fully autonomous.
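To make the difference concrete, here is a purely hypothetical sketch of interval logic. None of these numbers or conditions come from Tesla, Mercedes, or Volvo; they are placeholders meant only to contrast a short, fixed interval with a longer, condition-dependent one.

```python
# Hypothetical hands-on-wheel nag logic. Illustrative only: the thresholds and
# conditions below are invented, not taken from any manufacturer's software.

def seconds_until_nag(fixed_short_interval: bool,
                      road_is_straight: bool,
                      markings_are_clear: bool) -> int:
    """Return how long hands may stay off the wheel before the system chimes."""
    if fixed_short_interval:
        # The conservative approach: a short, fixed interval that keeps
        # pulling the driver's hands back to the wheel.
        return 30
    # The permissive approach: in ideal conditions the nag may not fire
    # for several minutes.
    if road_is_straight and markings_are_clear:
        return 300
    return 60

print(seconds_until_nag(True, True, True))    # 30 seconds: hands go back on the wheel
print(seconds_until_nag(False, True, True))   # 300 seconds: long enough to get comfortable
```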

Tesla could cut its interval overnight with a wireless software update, but then both the company and its customers would lose something they crave. Consistent, uninterrupted use of Autopilot is how Tesla has gathered so much data, which—via what it calls Fleet Learning—improves Autopilot’s AI at a rate no rival can match.

The Mercedes and Volvo systems? It’s almost impossible to know how good they are. Short intervals force your hands back on the wheel so often you end up leaving them there, just as their legal departments hope you will.

Safety, and all.

To a Tesla owner, the Mercedes and Volvo intervals feel like defects. To Mercedes and Volvo owners, Tesla's feels like freedom, albeit a potentially dangerous one.

Understand the Interval, Understand the Accident

It’s not hard to imagine a scenario where Brown, judging conditions to be perfect, engaged Autopilot and began watching Harry Potter. Even without a water bottle or rubber band, the Tesla would likely have continued without an interval alarm for several minutes or longer, long enough for Brown to become complacent. Given the near-miss he had uploaded to YouTube, he would have been confident in Autopilot’s ability to save him if necessary.

If the last interval alarm was 60 seconds before the impact, at 60mph he would have been at least a mile away from the crash site—and even further away from the truck—which he would not have been able to see at that distance.
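A quick sanity check of that distance, with the caveat that the 60-second gap is my assumption, not a figure from the police report:

```python
# Distance covered between a hypothetical last interval alarm and impact.
# The 60 mph speed and 60-second gap are assumptions for illustration only.
speed_mph = 60
seconds_since_last_alarm = 60
fps = speed_mph * 5280 / 3600                  # 88 ft/s at 60 mph
distance_ft = fps * seconds_since_last_alarm   # 5280 ft
print(f"{distance_ft:.0f} ft, or about {distance_ft / 5280:.1f} mile")
```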

Once he took his eyes off the road, the accident was probably inevitable.

Could the latest Mercedes E-Class or Volvo S90 systems have saved him? With their shorter intervals, at 60 mph? Maybe. At 80 mph? Possible, but unlikely.

As of today, the driver remains responsible. Legally, philosophically, and morally.

I don’t think Brown would disagree.

I still believe Autopilot is the best such system, but like all technology, it is only as good as our understanding of its limitations. Should Tesla castrate Autopilot? Absolutely not. It’s the last thing Brown—or anyone who believes AD will save lives—would have wanted.

What Comes Next?

“So,” Comms texted me after the crash, “where’s the Elon Musk Just Killed 340,000 People story on The Drive?”

Good question. It’s not coming, at least not from me, because I no longer buy the argument.

Why? Because there’s a lot more to the story, from a brewing war among car manufacturers to “own” AD, to some very interesting comments from George Hotz of Comma.AI.

Alex Roy is an Editor-at-Large for The Drive, author of The Driver, and set the 2007 Transcontinental “Cannonball Run” Record in 31 hours & 4 minutes. You may follow him on Facebook, Twitter and Instagram.
