7 Predictions For Tesla’s Autopilot 8

Warning: Some of these will be controversial.

By Alex Roy

Any minute now, Elon Musk will reveal details of the biggest update to Tesla’s Autopilot suite since its release in October 2015. He was supposed to do this last Wednesday, and then, when that didn't come to pass, over the weekend. I’m sure he had bigger fish to fry, which gave me more time to digest the lessons of my most recent Tesla cross-country drive, and to contemplate what we can expect to see in Autopilot 8.

Without further ado:

Improved Situational Awareness Display

Might this mean better object recognition, with a display drawn to scale? More accurate icons, including pedestrians? How about making the secondary nav and power displays smaller, and the central display bigger?

Mode Confusion/Alerts/Involuntary Disengagements

I’ve lumped mode confusion, alerts, and involuntary disengagements together because they clearly go hand-in-hand-in-hand. Mode confusion is blamed for 90 percent of all plane crashes attributed to human error. As good as Autopilot 7 is, its state-of-engagement feedback—that is, the user’s awareness of whether or not it’s actually engaged—is mediocre at best. But that's better than everyone else’s, which is non-existent. After 7,000+ miles of Autopilot 7 usage, I’ve had only one mode confusion incident: I thought the system was on when it wasn’t.

Again, even though Autopilot 7 is state of the art, its state-of-engagement notification is limited to two pairs of pleasant-sounding chimes (ascending or descending for engagement or disengagement) and a little grey or green steering wheel to the right of the speedometer. Not good enough.

I’m not suggesting Christmas lights here, but man, anything between Christmas lights and what we currently have would be safer. I'm not exaggerating when I say the entire dash should pulse green when Autopilot is engaged, and the phrase "Autopilot Engaged" should scroll across it every 20 seconds. Better yet, install a heads-up display and have that also pulse in the driver’s line of sight. Do those things—do anything—before the government mandates something we all hate.

Oh, and when the system disengages, make the entire dash flash red and sound an unpleasant horn. Over and over. Shocking? It should shock. It will also make it less likely that people will try to engage the system in less-than-ideal conditions. Is it unsafe to shock? It’s safer to shock once and have people remember it than it is to crash and have people remember that. (See the Conditional Prohibitions prediction below.)
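
What would that look like in practice? Here's a minimal sketch in Python of the feedback loop I'm imagining. Every name here (the vehicle interface, the dash calls, the alarm) is my invention, not anything Tesla has published:

    import time

    PULSE_INTERVAL_S = 20  # scroll "Autopilot Engaged" every 20 seconds

    def engagement_feedback_loop(vehicle):
        """Drive dash color and audio alerts from Autopilot's engagement state."""
        last_pulse = 0.0
        was_engaged = False
        while True:
            engaged = vehicle.autopilot_engaged()  # hypothetical accessor
            if engaged:
                vehicle.dash.set_color("green")
                now = time.monotonic()
                if now - last_pulse >= PULSE_INTERVAL_S:
                    vehicle.dash.scroll_text("Autopilot Engaged")
                    last_pulse = now
            elif was_engaged:
                # Disengagement just happened: flash red and sound the horn
                # until the driver acknowledges it. Impossible to miss.
                while not vehicle.driver_acknowledged():
                    vehicle.dash.flash("red")
                    vehicle.sound_alarm()
            was_engaged = engaged
            time.sleep(0.1)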

Re-engagement Prohibitions

We’ve already seen a leak about this, and it’s exactly what I hope Musk will give us. If a driver ignores the disengagement warnings in Autopilot 7, the car will eventually come to a stop. Fine. What’s allegedly coming in 8 is that the user won’t be able to re-engage Autopilot until the car has been placed in Park, then back into Drive.

I have a strong suspicion that most of the accidents blamed on Autopilot occurred because drivers kept trying to re-engage in less-than-ideal conditions after numerous hands-off warnings. This should solve that. I hope.
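
The rumored rule is simple enough to sketch as a tiny state machine. This is my guess at the logic in Python, with gear names and method names invented for illustration:

    class ReengagementLock:
        """Lock out Autopilot after a forced stop until a Park-to-Drive cycle."""

        def __init__(self):
            self.locked = False
            self.seen_park = False

        def on_forced_stop(self):
            # Autopilot brought the car to a stop after ignored warnings.
            self.locked = True
            self.seen_park = False

        def on_gear_change(self, gear):
            if self.locked and gear == "PARK":
                self.seen_park = True
            elif self.locked and self.seen_park and gear == "DRIVE":
                self.locked = False  # full Park-then-Drive cycle completed

        def may_engage(self):
            return not self.locked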

This kind of prohibition is similar to rules baked into Airbus aviation autopilot systems, which I will be addressing in an upcoming article. We need a lot more aviation autopilot logic pulled into automotive, and this is a huge (if unsexy) step in the right direction.

Autosteer Feel/Torquing

The Autosteer aspect of Autopilot puts too much torque into the steering. Call it too much of a good thing. Everyone else puts too little. I want to be able to change lanes within Autopilot, but currently, any steering input while engaged disengages Autopilot altogether. Therefore, to keep it engaged, I often have to take one hand off the wheel, because even a hand resting there can apply enough input to disengage the system.

Loosen this up, just a little. Say, 10 percent.
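
In code terms, "loosen it up 10 percent" might mean nothing more than raising the driver-torque threshold that triggers a disengagement. A sketch, with numbers I invented for illustration (I have no idea what Tesla's actual values are):

    CURRENT_DISENGAGE_TORQUE_NM = 3.0  # hypothetical Autopilot 7 threshold
    LOOSENED_DISENGAGE_TORQUE_NM = CURRENT_DISENGAGE_TORQUE_NM * 1.10

    def should_disengage(driver_torque_nm: float) -> bool:
        """Disengage only above the loosened threshold, so a hand resting
        on the wheel no longer kicks the system off."""
        return abs(driver_torque_nm) > LOOSENED_DISENGAGE_TORQUE_NM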

Geo-Capping/Conditional Prohibitions/Fleet Learning

This one seems obvious. Weather, darkness, and direct sunlight all have negative effects on Autopilot, greatly increasing the likelihood of an involuntary disengagement; therefore, Autopilot should not be engage-able in these conditions. I’ve engaged Autopilot in moderate rain at 90 mph. I’ve also chosen to dial it back to 30 mph for my own safety. It appeared to work perfectly—emphasis on "appeared to." The average person is unable to judge whether such conditions are suitable for Autopilot, and shouldn’t be allowed to use such a technology until it matures and both driver education and product training improve.

The myriad scenarios in which such limitations should be imposed would fill volumes, but Tesla’s Fleet Learning must be capable of correlating historical weather, light levels, and sun placement with disengagements, driver inputs, and accidents.
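
Here's roughly how a conditional-prohibition gate could work, sketched in Python. The thresholds and signal names are all assumptions on my part; presumably Fleet Learning would tune the real values from correlated disengagement data:

    from dataclasses import dataclass

    @dataclass
    class Conditions:
        rain_intensity: float  # 0.0 (dry) to 1.0 (downpour)
        ambient_light: float   # 0.0 (pitch black) to 1.0 (full daylight)
        sun_in_camera: bool    # direct sunlight washing out the camera

    MAX_RAIN = 0.3   # invented threshold
    MIN_LIGHT = 0.2  # invented threshold

    def engagement_allowed(c: Conditions) -> bool:
        """Refuse engagement in conditions correlated with disengagements."""
        if c.rain_intensity > MAX_RAIN:
            return False
        if c.ambient_light < MIN_LIGHT:
            return False
        if c.sun_in_camera:
            return False
        return True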

Expect to see a lot more Fleet Learning data integration—if not in Autopilot 8, then in interim updates. That’s the whole point of Fleet Learning, and where Tesla has an advantage. For now.

Geo-Fencing

This one also seems obvious, probably because it’s kin to the Geo-Capping prediction above. The highway on which Joshua Brown’s accident occurred had perpendicular access roads lacking 4-way lights or stop signs. Would capping Autopilot’s speed on that road have saved Brown’s life? Maybe not, but it certainly wouldn’t have hurt his chances of survival, and if Brown had been forced to take the wheel because he personally felt the need to speed, he’d have been more likely to avoid the accident.

Likewise, Autopilot isn’t intended to work in Manhattan traffic, so why should it be possible to engage it within our snarled grid system? I consider myself a bit of an expert on Autopilot. Really, I love it, but that just means I know more about its limitations than the usual critics, most of whom have somewhere between zero and one hour behind the wheel.

Until Autopilot sees lights and signs, it shouldn’t be possible to use it in Manhattan unless you’re on the FDR Drive or the West Side Highway north of 57th Street. For the record, I did the entire FDR Drive last week on Autopilot, in traffic, with only two disengagements. It was brilliant. (Video coming soon.)
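
A crude version of that geo-fence is easy to imagine: a whitelist of corridors where engagement is allowed, checked against GPS. The bounding boxes below are my rough eyeball estimates, for illustration only; a real system would use proper map geometry:

    def in_box(lat, lon, box):
        lat_min, lat_max, lon_min, lon_max = box
        return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

    # Very rough bounding boxes (my estimates, not product data).
    ALLOWED_CORRIDORS = [
        (40.70, 40.80, -73.98, -73.96),  # FDR Drive, approximately
        (40.77, 40.85, -74.01, -73.98),  # West Side Highway north of 57th St
    ]

    def geofence_allows(lat, lon, in_manhattan):
        if not in_manhattan:
            return True  # the fence only applies inside the grid
        return any(in_box(lat, lon, box) for box in ALLOWED_CORRIDORS)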

Anywhere there’s a happy Tesla owner, there’s an area they know Autopilot shouldn’t be used, at least right now. Let’s see some more Fleet Learning applied, maybe specifically toward new owners.

Oops, I think I just opened a can of worms. Well, here’s a bigger one:

Speed Limits

I have no problem with Autopilot operating up to the speed limit. Given the artificially low limits in many areas and the ubiquity of traffic flowing 5-10 mph over the limit, Autopilot should have a variance capable of matching traffic flow—but only in the presence of such traffic. (Google’s Level 4 logic will allegedly have such a variance.) If no such traffic is present, Autopilot should be limited to the local speed limit.

Is Tesla’s current sensor suite capable of detecting traffic flow for this purpose? At the speeds we're talking about, not to the rear. The feasibility and logic of this deserve their own article, but something must be done to balance the convenience of Autopilot with safely staying in the flow of traffic.
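
The cap logic itself is simple to express, even if the sensing isn't. A sketch in Python, where the 10 mph variance and the traffic test are my assumptions, not anything Tesla or Google has confirmed:

    MAX_VARIANCE_MPH = 10  # assumed allowance over the posted limit

    def autopilot_speed_cap(posted_limit_mph, nearby_traffic_speeds):
        """Return the maximum speed Autopilot should hold."""
        if not nearby_traffic_speeds:
            return posted_limit_mph  # no traffic to match: obey the limit
        flow = sum(nearby_traffic_speeds) / len(nearby_traffic_speeds)
        # Raise the cap toward the flow of traffic, but never more than
        # the variance above the posted limit, and never below the limit.
        return min(max(flow, posted_limit_mph),
                   posted_limit_mph + MAX_VARIANCE_MPH)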

I say all of this as a huge fan of Autopilot, and Musk’s efforts to advance this technology. Better that Musk should address this than the government.

And the odds of my being right are about...

Let's say fifty-fifty. Guessing is the fun part. I can’t wait to see you in the comments.

Alex Roy is an Editor-at-Large for The Drive, author of The Driver, and holder of the 2007 Transcontinental “Cannonball Run” Record, set in 31 hours and 4 minutes, along with many other records that have yet to be announced. You may follow him on Facebook, Twitter and Instagram.
