Consumer Reports Calls Out Tesla’s Autopilot, Tesla Doesn’t Care

And the automaker is exactly right.

By Alex Roy

Consumer Reports is trying to kill you. It's true: The non-profit my parents trusted for advice on washing machines, coffee makers, and sunscreen has crossed the line. CR, an organization claiming to serve consumers through unbiased product testing, has chosen to enter the debate over Tesla's Autopilot in the most ill-informed and irresponsible way possible. In doing so, the brand is putting additional lives at risk.

No company has done more to bring autonomous driving (AD) to market than Tesla, and yet they are now the target of a misinformation campaign rapidly coalescing around a single message: Tesla Autopilot is dangerous.

This is nonsense.

Drivers using Tesla Autopilot are already safer than those who don't use it, and yet less than two weeks after the announcement of the first known driver fatality to occur while Autopilot was engaged, CR joined the chorus of those ignorant of how AD (and specifically Tesla Autopilot) actually works, and issued Tesla a de facto ultimatum: Disable or castrate Autopilot, or else.

Tesla’s response: Drop dead.

Tesla’s response to CR was 100 percent correct, whatever your opinion of Elon Musk’s alleged misdeeds or PR skills. The National Highway Traffic Safety Administration's and National Transportation Safety Board's investigations have barely begun, and yet CR—with few facts and limited understanding—would use their media influence to hobble a demonstrably superior safety technology before the government can properly judge its merits.

But by what standard can such technologies be judged? There's only one: Research suggests AD will reduce traffic fatalities by up to 90 percent. If even a single life can be saved, AD has value, and any form of it is a good thing. The only moral debate, then, is one over how to make AD available as quickly as possible, to the broadest possible audience.

Tesla's Autopilot is not perfect, but it's already better than all the alternatives—the most popular being not using it at all, which is why we have 3,287 car crash fatalities a day worldwide, 105 of them in the United States. That's roughly 1,200,000 deaths a year, more than 38,000 in the United States, entirely from human driving.

CR may be a non-profit, but they’re acting very much like a for-profit, betraying their mission statement by placing their once-respected flag in a clickbait swamp.

Let’s review the genesis of this alleged debate, dissect CR’s “request” to Tesla, and talk about what should be done to improve Autopilot and further increase safety.

The Autopilot “Accidents”

The anti-Tesla Autopilot firestorm was kicked into high gear by Joshua Brown’s fatal accident on May 7th in Florida, followed by two other, non-fatal crashes in Pennsylvania and Montana.

However, on further inspection, it's likely that these crashes, like the overwhelming majority of car accidents, are due not to some kill-crazy technology, but to driver error.

The fatal Florida crash cannot reasonably be blamed solely on Autopilot—perhaps not even partially. It might be blamed on a negligent truck driver, but it was most likely the fault of Tesla fan Joshua Brown, who knowingly failed to pay attention due to overconfidence in a system whose limitations were well known to him. If he had been paying attention, he almost certainly could have avoided the accident.

The driver in the Pennsylvania crash was cited for careless driving. How does one drive carelessly on Autopilot? Anyone who's used it knows its lane centering is exceptional, vastly superior to competing systems, and more consistent than most human drivers, as demonstrated by Motor Trend, Car & Driver, Autofil, and our own upcoming comparison with the 2017 Mercedes E-Class. Either the driver was speeding, lost control, and wants to blame Autopilot, or the driver engaged Autopilot at too high a speed and didn't keep his hands on the steering wheel, as Tesla recommends drivers do.

The circumstances of the Montana crash are unclear, but the driver said he would buy another Tesla.

The first two incidents appear to have negligence in common; in the third, it sounds like the driver doesn't blame the car (so, perhaps, negligence was also in play).

Where negligence has a will, negligence finds a way, which is why three out of four of CR’s recommendations won’t improve safety, and may in fact reduce it. Speaking of . . .

The Consumer Reports “Request”

  • Disable the Autosteer function of the Autopilot system until it can be reprogrammed to require drivers to keep their hands on the steering wheel
  • Stop referring to the system as “Autopilot” as it is misleading and potentially dangerous
  • Issue clearer guidance to owners on how the system should be used and its limitations
  • Test all safety-critical systems fully before public deployment; no more Beta releases

To which Tesla—in the typical PR speak that rarely works in such cases—responded:

“Tesla is constantly introducing enhancements proven over millions of miles of internal testing to ensure that drivers supported by Autopilot remain safer than those operating without assistance. We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media.”

What else could Tesla say? Much more, if they understood that the audience for any such response isn’t Consumer Reports, but the world of people who don’t own a Tesla and don’t yet fully understand the technology, terminology, or the firestorm surrounding them.

The Folly of Restricting Autosteer

Autosteer is exactly what it sounds like; combined with Adaptive Cruise Control (ACC), it forms Autopilot, Tesla's proprietary name for what NHTSA calls Level 2 autonomy. In other words, partial autonomy, requiring human oversight at all times—just in case.

Tesla is being unfairly singled out, and for all the wrong reasons. The latest Mercedes and Volvo models possess hardware similar to Tesla's, with functionality almost identically branded. Autopilot = Drive Pilot = Pilot Assist. Autosteer = Steering Pilot = you get the idea. All such systems have what we'll call the Hands-on-Wheel Interval: the maximum time the system will tolerate no detected hand contact with the wheel before demanding it. Mercedes Drive Pilot's maximum interval? Sixty seconds. Tesla Autopilot's? Up to ten minutes.
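To make the term concrete, here is a minimal, purely illustrative Python sketch of what a Hands-on-Wheel Interval check amounts to. The function and constant names are invented for this article; the two limits simply echo the figures cited above, not any manufacturer's actual logic.

```python
def needs_hands_on_warning(seconds_since_hand_contact: float,
                           max_interval_s: float) -> bool:
    """True once hands-off time exceeds the system's allowed interval."""
    return seconds_since_hand_contact > max_interval_s

MERCEDES_DRIVE_PILOT_INTERVAL_S = 60   # "sixty seconds," per the figure above
TESLA_AUTOPILOT_INTERVAL_S = 10 * 60   # "up to ten minutes," per the figure above

print(needs_hands_on_warning(90, MERCEDES_DRIVE_PILOT_INTERVAL_S))  # True
print(needs_hands_on_warning(90, TESLA_AUTOPILOT_INTERVAL_S))       # False
```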

CR is suggesting Autopilot's interval is the problem, and that Autosteer should either be completely disabled, or disabled unless the driver has a hand on the wheel. But suppose Tesla reduced its interval to sixty seconds; a cursory analysis of the conditions, speeds, and distances in the Joshua Brown accident indicates even that wouldn't have prevented the crash. You know what would have? A driver who was paying attention.

Here’s a data point. Joshua Brown had eight(!) speeding tickets. This doesn’t sound like the kind of person for whom disabling Autopilot and forcing his hands on the wheel would have made a difference.

I've just spent a long week with a Tesla and the latest Mercedes E-Class, and a month with the latest Volvo. Nearly 10,000 miles of testing later, it's clear that CR has this backward. These systems may all be grouped under Level 2, but they are not the same. Each has unique characteristics, inspiring varying levels of confidence depending on speed, light, traffic, and weather; each reacts differently to steering inputs when in semi-autonomous mode; each brakes differently and responds to other cars differently.

All three would be vastly safer if the government would only define a best-of-breed set of common behaviors and features so that users would know what to expect. Feedback is necessary, and the more users the better, which is why the Tesla system, with 90,000 users gathering data via Tesla's networked Fleet Learning platform, feels so much more advanced, and safer.

But this isn’t what CR is suggesting. CR would stop all this now, until some fantasy testing and development happens behind closed doors, or in simulators. The fact that it might take ten or twenty extra years, and cost countless more lives, doesn’t seem to be of consequence.

No One Thinks Autopilot Equals Self-Driving Car

CR would have you believe Tesla is tricking people into a false sense of security by using the brand name “Autopilot” for a system that isn’t actually an autopilot.

I’m not a pilot, nor have I ever commanded a ship, but I know that planes and ships have captains. I also know they have autopilot systems. I know this because I’ve been on a plane and seen the flight crew go to the bathroom. I’ve also seen the flight crew return to the cockpit and sit in chairs facing forward, with flight controls within arm’s reach. At no point in my life have I ever assumed planes or ships were fully autonomous, because where there is a crew, there is usually a need for one.

CR's suggestion is an insult to the overwhelming majority of Tesla owners who drive several million miles on Autopilot every day, and it serves only to lower the level of the conversation around AD. By unfairly painting Autopilot as unsafe, CR is weakening potential demand for an underdeveloped market any consumer advocate should be keen to cultivate, even now, in its technological infancy.

By failing to help clarify misleading media coverage, CR has effectively joined in propagating the notion that Tesla owners are victims, and that one merely gets into a Tesla, hits a button, and the car drives itself until it makes a mistake.

No one who owns a Tesla thinks it’s a fully self-driving car. Learning this is as easy as trying to engage Autopilot the first few times, then watching it disengage.

Although it is possible to briefly engage Autopilot at speeds and under conditions that may be irresponsible, it is actually extremely difficult, requiring repeated trial-and-error. All of these errors, engagements, and disengagements are geolocated and correlated to the myriad sensor data gathered by 90,000+ Teslas at a rate of 3,000,000 miles a day, comprising 140+ million miles of data to date.

Every day, Tesla Autopilot gets a little better. Is that something CR is opposed to?

Drivers Using Autopilot Are Safer Than Those Who Don’t

Even if Tesla stopped development today, Tesla Autopilot users would be better off. Here are the figures:

Human Driving = 1.08 deaths per 100 million miles driven

Human Driving + Tesla Autopilot = 0.76 deaths per 100 million miles driven

If even one life is saved by the use of Autopilot or similar systems, their deployment in any form is morally justified, and the only debate must be over the minimum standard of safety the law can and should require over and above doing nothing. Knowing what little we know now, any restriction of Autopilot is, by definition, a step back, unless it can be shown to lower fatalities. Because those most likely to have an accident while using Autopilot—anecdotally, negligent drivers—are even more likely to have an accident without it.
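For anyone who wants to sanity-check numbers like these, the metric is simply fatalities divided by miles driven, scaled to 100 million miles. Below is a minimal Python sketch; the inputs are assumptions drawn from Tesla's public statements at the time (roughly one known Autopilot fatality in about 130 million Autopilot miles, versus roughly one US fatality per 94 million miles of all driving), not figures taken from this article, but they land in the same ballpark as the 1.08 and 0.76 above.

```python
def deaths_per_100m_miles(fatalities: int, miles_driven: float) -> float:
    """Fatality rate normalized to deaths per 100 million miles."""
    return fatalities / miles_driven * 100_000_000

# Illustrative, assumed inputs (not taken from this article):
print(round(deaths_per_100m_miles(1, 130_000_000), 2))  # ~0.77, Autopilot miles
print(round(deaths_per_100m_miles(1, 94_000_000), 2))   # ~1.06, all US driving
```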

The Folly of “No More Beta Releases”

Do the editors of CR read their own magazine? Virtually every car ever made, with the exception of certain Lexus models, has been a Beta release. Had a recall? Beta release. Had a warranty repair? Beta release. Manufactured in Italy? Beta release. English wiring harness? Beta release. German infotainment system? Beta release. American ignition system? Japanese airbag?

All Beta releases. All packaged as final versions.

Tesla is the first company to offer a wirelessly upgradeable car. Not when you visit a dealer trying to upsell you. Not every five or seven years. Tonight. While you’re sleeping. For free.

Tesla’s software will always be a Beta release, because they’re a tech company, and their attempt at disrupting the automotive business depends on constant upgrades.

Is CR suggesting it would be better to restrict any and all software updates to the traditional 5-7 year model replacement cycle, and have the government review the code?

The most optimistic projections for AD suggest a 90 percent reduction in traffic fatalities. Mistakes will be inevitable.

Is CR suggesting 90 percent is insufficient, and that unless a 100 percent reduction in fatalities can be guaranteed, we should be satisfied with 38,400 fatalities a year?

How does CR suggest manufacturers get to that level of certainty?

How can CR justify restraining Tesla’s development path when, less than one year after going on sale, Autopilot has already surpassed Human Driving in terms of safety?

What Consumer Reports Got Half-Right

CR is absolutely correct about the need for manufacturers to educate users on their system’s capabilities and limitations.

How To Improve Autopilot

There is a clear path to improving Autopilot, but it’s the opposite of what Consumer Reports suggests in every way but one. The key is to balance the driver’s situational awareness and engagement with conditions.

1) Improve situational awareness: Users should know, at all times, what the car sees. This is where the Tesla is vastly superior to every competing system. The Autopilot dash display is decent—clearly displaying other cars and lane lines—but still not large enough. The audible alert for engagement is decent, but the disengagement alert should be 100 percent louder. Meanwhile, the comparable Mercedes system is insufficient to the point of being dangerous.

2) Limit engagement based on location: If Autopilot has disengaged too frequently at certain locations, and/or users have attempted to engage it at certain speeds in locations where it would be unsafe even under human control, Autopilot should be permanently locked out at those locations at those speeds.

3) Limit engagement based on weather and speed: Carefully, though, as Autopilot may be necessary to ensure safety at lower speeds, especially for less confident drivers.

4) Dynamically change the interval based on location and conditions: A short interval often makes me want to disable Autopilot altogether, even in conditions where leaving it active would be safer. Intervals should be based on conditions, never on time alone, to maximize usage rather than deter it.

5) Change disengagement resistance: Autopilot is very sensitive, disengaging with the mildest touch of the steering wheel, which is jarring (and potentially unsafe) when my only goal is to keep it engaged rather than take over myself.

6) Slow it down, but only to the flow of traffic or 10 percent over: Autopilot currently functions up to 90 mph on most interstates, with speed restrictions on secondary roads varying by location. Autopilot's maximum speed should be tied to conditions. If restricted to the speed limit, most users (myself included) won't use it at all. A dynamic limit matched to the flow of traffic might be safer in some conditions, but only if it doesn't deter use.

7) Test the Driver: Credit the brilliant Brad Templeton for this suggestion: Autopilot should test its user with periodic, random disengagements. Fail to take control in time, too many times? Your Autopilot is disabled for some arbitrary period of time. Like a game, albeit one that will save your life.

Here's the thing: if Tesla enables the last condition—Driver Testing—numbers one through six probably aren't necessary.
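For what it's worth, here is a minimal, purely hypothetical Python sketch of how such driver testing might be structured. None of this reflects Tesla's actual software; the class name, probabilities, thresholds, and the driver_took_control_within callback are all invented to illustrate the idea.

```python
import random
import time

class DriverAttentionTester:
    """Hypothetical sketch of Templeton-style driver testing (not Tesla's code)."""

    def __init__(self, response_window_s=8.0, max_failures=3, lockout_s=3600):
        self.response_window_s = response_window_s  # time allowed to take over
        self.max_failures = max_failures            # strikes before lockout
        self.lockout_s = lockout_s                  # arbitrary lockout period
        self.failures = 0
        self.locked_until = 0.0

    def autopilot_allowed(self, now=None):
        """Autopilot stays available unless a lockout is in effect."""
        now = time.time() if now is None else now
        return now >= self.locked_until

    def maybe_run_test(self, driver_took_control_within, p=0.01, now=None):
        """Call on a regular tick while Autopilot is engaged.

        driver_took_control_within(window_s) -> bool is supplied by the
        vehicle: it prompts the driver to take over and reports whether
        they responded within the window.
        """
        now = time.time() if now is None else now
        if not self.autopilot_allowed(now) or random.random() > p:
            return
        if driver_took_control_within(self.response_window_s):
            return  # passed the random spot check
        self.failures += 1
        if self.failures >= self.max_failures:
            self.locked_until = now + self.lockout_s  # disabled for a while
            self.failures = 0

# Toy usage: an attentive driver always passes, so Autopilot stays available.
tester = DriverAttentionTester()
tester.maybe_run_test(lambda window_s: True, p=1.0)
print(tester.autopilot_allowed())  # True
```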

For the record, although I am pro-autonomy, I steadfastly believe in protecting human driving freedoms. I'm a little less capable every day, and I hope fully self-driving cars arrive before I have to give up my license due to age. If we let Consumer Reports and mass hysteria stifle innovation, that may not happen; I'll still be driving, and I may be joining those unfortunate 38,400.

But I'd rather not.

Alex Roy is an Editor-at-Large for The Drive, author of The Driver, and set the 2007 Transcontinental “Cannonball Run” Record in 31 hours & 4 minutes. You may follow him on Facebook, Twitter and Instagram.
