Tesla Recalls 362,000 Cars Because Full Self-Driving ‘May Cause Crash’
It follows a recall last year of 54,000 Teslas equipped with the driver assistance feature.
Tesla will recall more than 362,000 vehicles equipped with its Full Self-Driving suite of driver-assistance features because the software could cause a crash, the National Highway Traffic Safety Administration (NHTSA) said Thursday.
The recall covers 2016-2023 Tesla Model S and Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles equipped with (or pending installation of) Full Self-Driving software. The announcement says the vehicles can be fixed via over-the-air updates.
It's unclear from the announcement whether the recall is related to ongoing investigations by U.S. regulatory agencies into FSD, which has been linked to multiple crashes. According to a letter sent to Tesla from the NHTSA, the "FSD Beta system may allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution. In addition, the system may respond insufficiently to changes in posted speed limits or not adequately account for the driver's adjustment of the vehicle's speed to exceed posted speed limits."
"FSD Beta software that allows a vehicle to exceed speed limits or travel through intersections in an unlawful or unpredictable manner increases the risk of a crash," regulators also wrote.
The recall follows a similar action last year when Tesla recalled 54,000 cars for rolling through stop signs when Full Self-Driving was engaged.
Tesla's driver-assistance software is under investigation by the Department of Justice and other federal agencies following a string of crashes involving the software. Though Tesla has touted FSD as capable of operating the vehicle on its own, the company's website states that drivers must constantly monitor it. CEO Elon Musk, who directly oversaw a video overselling FSD's capabilities, has himself stated that FSD "could do the wrong thing at the worst time."
Got a tip? Send it in to email@example.com