Consumer Groups Demand FTC Investigation Into Tesla Autopilot
The Center for Auto Safety and Consumer Watchdog mailed the Federal Trade Commission their grievances against the controversial driving aid today.
On Wednesday, the Center for Auto Safety and Consumer Watchdog mailed a joint letter to Federal Trade Commission chairman Joseph Simons, requesting that the FTC investigate how Tesla has marketed its controversial "Autopilot" semiautonomous driver-assistance suite.
In the letter, the two organizations accuse Tesla of "deceiving and misleading consumers into believing that the Autopilot feature of its vehicles is safer and more capable than it actually is." The groups cite two known deaths and one injury as a result of drivers relying on Autopilot to control their vehicle as reason to investigate the marketing of Autopilot. They insist that the FTC examine Tesla's advertising practices surrounding the feature to determine whether Tesla can be faulted for its customers' misuses of Autopilot.
The two fatal accidents to which the letter refers are the collision between a Model X and a highway divider in March and the 2016 accident in which a Tesla Model S drove under a tractor trailer with Autopilot active. The injury cited in the letter is in reference to an accident earlier this month, when a Utah woman broke her ankle as a result of rear-ending a fire engine at 60 mph, her attention diverted to her phone.
In addition to these two fatal accidents, another occurred over the weekend, when a Model S left the road and ended up at the bottom of a pond. The investigation is ongoing, and though authorities have not issued a statement on the cause of the crash, ABC7 reports that Tesla has found Autopilot was not active. The Drive discussed this accident via email with Jason Levine, executive director of the Center for Auto Safety, and Consumer Watchdog official John Simpson, both signatories of the letter to the FTC.
Levine expressed concern as to whether the deceased driver was in full control of their vehicle at the time.
"Presuming that was the case," stated Levine, in reference to the news that Autopilot was inactive, "it leaves open many questions that were the reason we have asked the FTC to investigate in our letter today. Did the decedent believe Autopilot was engaged? If so, did he believe it was 'self driving?' Did he take it out of 'Autopilot' mode just before the crash? Is Tesla cooperating in terms of supplying crash data?
"Even if this particular case was an unavoidable tragic crash," continued Levine, "enough evidence exists that consumers are being misled by the messaging that comes not only from Tesla as a company, but from Mr. Musk himself both via social media and in his interviews with respect to the capabilities and shortcomings of the feature."
Simpson called for acceleration of the investigation into the weekend's crash, and decried Tesla's marketing of the Autopilot feature.
"Reports out today say that 'Autopilot' was not engaged when this crash occurred," stated Simpson. "It's imperative that as many details about what happened are released by the investigating authorities as soon as they are available. However, Tesla's aggressive and deceptive marketing of Autopilot has resulted in other deaths and injuries. It has to stop. That's why Consumer Watchdog and the Center for Auto Safety filed our complaint with the FTC."
Levine took a measured, if probing, stance on whether Tesla's advertising for "Autopilot" adequately conveys the system's capabilities and limitations.
"Driving can be dangerous, with over 37,000 deaths and over 2 million serious injuries from traffic crashes in the U.S. every year. No matter the car—or the features—drivers need to be paying attention and understand what their vehicle can and can’t do. What happened in this instance remains under review. The larger issue is ongoing: do consumers understand the risks associated with these features, and are manufacturers sufficiently educating their customers?"
The pair's letter to the FTC closes with a plea for the agency to investigate.
"In the case of the sole accident the NTSB was able to fully analyze, the Board attributed a lack of understanding of Autopilot's capabilities to the death of one of these consumers," state the two in their letter to the FTC.
"The burden now falls on the FTC to investigate Tesla’s unfair and deceptive practices so that consumers have accurate information, understand the limitations of Autopilot, and conduct themselves appropriately and safely. The Center for Auto Safety and Consumer Watchdog urge the FTC to conduct a timely investigation in order to prevent further tragedies."
The Drive contacted Tesla for comment on the accusations, and a spokesperson responded with the following statement.
"The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of."