Watch Tesla Autopilot Head Toward Location of Fatal Model X Accident
A gentle reminder that Autopilot isn't perfect.
The fatal accident that occurred two weeks ago in California continues to throw curveballs at Tesla's Autopilot program since it was revealed that the semi-autonomous feature was engaged at the time of the crash. A recent YouTube video posted to Reddit shows how the latest Autopilot update responds to the area where the crash occurred. Spoiler: it doesn't look good.
After deciding to test how the vehicle would respond, the Redditor took his Tesla Model S to the section of California Highway 101 where the crash occurred and engaged Autopilot. Within seconds, the vehicle can be seen veering toward the barrier.
The Redditor later reported testing the same stretch of road twice more over the following two days, with the same result each time.
Another video on Reddit shows a similar scenario occurring, and it also seems to reveal what might be going wrong. Autopilot appears to be attempting to center the vehicle between the two outermost lane markings while ignoring the split in the center of the road. Other users have reportedly experienced similar behavior with Tesla's latest software update that seemingly did not occur in earlier versions.
"[It] works for six months with zero issues. Then one Friday night you get an update. Everything works that weekend, and on your way to work on Monday. Then, 18 minutes into your commute home, it drives straight at a barrier at 60 mph," said Reddit user /u/beastpilot. "It's important to remember that it's not like you got the update five minutes before this happened. Even worse, you may not know you got an update if you are in a multi-driver household and the other driver installed the update."
Other users on /r/TeslaMotors called for Tesla to immediately roll back the latest update, while others pointed out that people really shouldn't be testing the feature this way for fear of causing another accident. The NTSB previously stated that it was unhappy with Tesla CEO Elon Musk revealing details of the accident ahead of the conclusion of the investigation, and now it might be clear why the board took that stance.
We reached out to Tesla regarding the video, and the company firmly reiterated that Autopilot should always be used with an attentive driver. That may not have been the case in the fatal Model X crash: data revealed that the driver was unresponsive to cues for six seconds prior to the impact.
"If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident," a spokesperson told The Drive, quoting from the company's own blog. "Tesla Autopilot does not prevent all accidents—such a standard would be impossible—but it makes them much less likely to occur."
Tesla stands by its claim that Autopilot is not (yet) a fully autonomous system and that drivers need to be in control of their vehicles at all times. Failure to respond to Autopilot's request for attention will disable the function until the vehicle is restarted, though people continue to find ways to outsmart the system.