Why Is Tesla’s Autopilot the Best Semi-Autonomous System? It’s All in the Name
What Tesla learned from the aviation industry.
Tesla presumably named its semi-autonomous driving suite "Autopilot" for a reason. After all, it's not a made-up bit of marketing jargon—it's the commonly used term for the self-piloting technology found in aircraft.
In fact, let's take a look at one definition of an autopilot system, from Wikipedia:
“...[A] system used to control the trajectory of a vehicle without constant 'hands-on' control by a human operator being required. Autopilots do not replace a human operator, but assist them in controlling the vehicle, allowing them to focus on broader aspects of operation…”
Now, let's look at Tesla's own description of Autopilot:
"[The system] allows [the] Model S to steer within a lane, change lanes with the simple tap of a turn signal, and manage speed…[and] while truly driverless cars are still a few years away, Tesla Autopilot functions like the systems that airplane pilots use when conditions are clear. The driver is still responsible for, and ultimately in control of, the car. What's more, you always have intuitive access to the information your car is using to inform its actions."
These two descriptions sound pretty similar in terms of how humans are supposed to interact with the system. It was a stroke of genius on Tesla's part to turn the generic term into the brand name for their proprietary technology suite, but the downside for the wider world of automotive autonomy is that no other company can use the term. Generically speaking, any car company with a decent Level 2 advanced driver-assistance system has an autopilot; the problem is, the need to create branded naming around those systems makes things confusing. Elon Musk owns the most elegant solution: a branded name that also succinctly describes the basic functionality.
Here's the kicker, though: Judging by Autopilot's exceptional usability, I'd wager the aviation system wasn't just an inspiration for the name. Those who designed Tesla's technology seem to understand something about autonomy that the rest of the industry doesn't.
The future of autonomous driving isn't binary. Between where we are right now and the fantasy of ubiquitous fully autonomous, nap-in-the-backseat cars, there will be mixed modes in which the autonomous system operates under certain conditions and doesn't under others—basically, what the National Highway Traffic Safety Administration calls "Level 3" autonomy. As it stands now, many car companies think this level is best "solved" by skipping it altogether and waiting for the day that Level 4 magically arrives. This is ludicrous, and here's a quick story that explains why:
Once, in 1981, the ten-year-old version of me ran up to a pilot who was just exiting the plane's bathroom, tugged on his uniform pant leg, and asked if planes would ever be able to fly themselves 100 percent of the time. (I remember asking in the hope that the cool job of "airline pilot" would still be around in the future.) The pilot—tall, handsome, and as authoritative as Tom Skerritt’s character in "Top Gun"—looked down and told me, basically, "yes in theory, no in practice," and went on to explain that equipment failure might occur, and that a human was the best backup to that scenario. And since humans sometimes needed a bathroom break, or some sleep, or got sick, most commercial aircraft had two, and sometimes three people in the cockpit.
Thirty-five years later, this is still true for planes. And yet many automakers seem to think that everything short of 100 percent autonomy is pointless, if not dangerous. But from the analog autopilots of 100 years ago to the latest Boeing and Airbus technologies, the plane-based systems only function under certain conditions. They're limited by geography and weather, for example, and are restricted by the complexity of the conditions they face; how an autopilot system operates, or doesn't, is a function of location, task, and varying levels of human control.
Tesla, with its Fleet Learning and notoriously permissive technology, actually seems to understand this very basic point: human beings will always need to be involved in the operation of a motor vehicle. If not all of the time, then some of the time. Maybe less so in the future, but if the idea of mixed-mode autonomy hasn't been phased out in aviation, it's not going anywhere in the more complex system that is the world's highways, freeways, back roads, and residential streets. Differing times of day, passenger loads, geography, and weather—to say nothing of truly aberrant conditions—will call for different speeds, cornering behavior, braking, and more, and will sometimes call for varying levels of driver input.
Fully autonomous cars, if they're even technologically possible, will never be deployable across 100 percent of the roads in first-world countries, let alone those in second- or third-world nations. (And cultural barriers may be as high or higher than legal ones; show me a map of gun ownership in the United States and I'll draft you a rough calendar of self-driving car adoption rates.) Legacy car companies that are hobbling their own automotive autopilots out of legal fears while simply waiting for the day that full Level 4 autonomy arrives are gifting Tesla the opportunity to both develop the superior semi-autonomous technology and own the narrative.
In next week’s column, we’ll get into the differences between aviation autopilots, states of engagement, Tesla Autopilot 8, and the lessons learned on my most recent semi-autonomous Tesla cross-country road test, which didn’t take as long as I expected, and was done on Autopilot 97 percent of the time.
Oh, and I just booked my first flying lesson.
Alex Roy is an editor-at-large for The Drive, author of The Driver, and set the 2007 transcontinental “Cannonball Run” record in 31 hours & 4 minutes. You may follow him on Facebook, Twitter and Instagram.