Tesla Uses Autopilot Data to Defend Itself, But What About Driver Privacy?
Tesla only releases data when it suits its purposes, a new report claims. The carmaker says it's just setting the record straight.
When a Tesla electric car using the automaker's Autopilot system crashes, Tesla likes to turn to in-car data to dismiss claims that its tech is at fault. But what about the customers who generate that data?
While Tesla is quick to use Autopilot data to counter claims of faults or glitches, it is less willing to release that data to its customers, or even to seek their permission before releasing it, according to The Guardian. The newspaper said it could not find a single case in which Tesla sought permission before releasing data to the media when Autopilot was suspected to be at fault in a crash.
The Guardian also discovered one case in which Tesla explicitly denied an owner's request to see data from his own car. A Swiss driver, who spoke on condition of anonymity, wanted access to car data after his Model S collided with a van on the highway. While he considers himself a "Tesla fanboy," the driver said he was concerned about being denied data that he could use to defend himself in court.
The Swiss Model S owner had requested data logs from his car, but Tesla has not released anything quite as extensive publicly. The carmaker typically releases specific pieces of information to counter what it views as unfair or inaccurate claims about Autopilot made by owners. Those disclosures have included revealing that a Montana Tesla driver did not have his or her hands on the wheel during a June 2016 crash, and that a California driver deactivated Autopilot by pressing the brake pedal, resulting in a collision the driver blamed on an Autopilot fault.
"Autopilot has been shown to save lives and reduce accident rates, and we believe it is important that the public have a factual understanding of our technology," Tesla said in a statement defending its practices.
"In unusual cases in which claims have already been made publicly about our vehicles by our customers, authorities, or other individuals, we have released information based on the data to either corroborate or disprove these claims," the automaker said. "The privacy of our customers is extremely important and something we take very seriously, and in such cases, Tesla discloses only the minimum amount of information necessary."
What's clear is that Tesla considers setting the record straight on Autopilot to be vitally important. Despite its name, Autopilot is not a truly autonomous system; it's closer to the bundles of driver-assist features offered by other automakers. But the name "Autopilot" has led to some confusion among customers about the system's actual capabilities. Autopilot was widely criticized after a fatal May 2016 crash involving a Model S running the system, but a National Highway Traffic Safety Administration (NHTSA) investigation cleared Tesla of any wrongdoing.
UPDATE: Tesla told The Drive that, in the case of the Swiss driver mentioned by The Guardian, it provided all information necessary under the Swiss Data Protection Act, and did not release any information from that incident to the press.