The aftermath of this weekend's fatal Tesla Model S crash, in which authorities allege the two victims were riding in the passenger seats with no one behind the wheel, continued to unfold on Monday when two separate federal agencies announced they were sending special investigation teams to the Houston-area crash site. Meanwhile, Tesla's silence on the wreck was broken by CEO Elon Musk (who else) on Twitter (where else), as he claimed that data recovered from the car "so far" don't show that the company's Autopilot driver-assist software was enabled, and that the residential road where the Model S crashed wouldn't have allowed it to be activated in the first place. Following that tweet, local authorities said they're issuing a search warrant to get a look at that data.
A lot to unpack there. Both the National Highway Traffic Safety Administration and the National Transportation Safety Board are now investigating, and no matter your opinion of the feds, the quick response here is adding to the buzz that the government might finally be preparing real regulatory action over semi-automated driving systems like Autopilot. Though it's an extreme example, this weekend's tragedy in Texas is just the latest such incident that the feds have probed; the NHTSA has well over 20 active investigations into Tesla crashes at the moment, four of them opened in the past two months.
Harris County officials say they're "100 percent certain" that neither of the two men who died in the crash was sitting in the driver's seat when the 2019 Tesla Model S flew off a residential road in a Houston-area subdivision on Saturday night and struck a large tree, the car immediately exploding in flames that took firefighters hours to subdue. One victim was found in the passenger seat, and the other in the back seat. Police claim this is also backed up by witnesses, adding that the deceased, one of whom owned the Tesla, were overheard discussing its Autopilot functions just minutes before hopping in the car for an apparent joyride around the neighborhood.
Simply put, there is a boatload of circumstantial evidence but so far no hard evidence that the Tesla was operating on Autopilot when it crashed. Which of course means nothing, legally speaking. And if the car is mostly incinerated, there's a chance the onboard data that would give us the answer are completely lost, and all we'll have are Tesla's internal records and its word that those are accurate. That's probably why Musk decided to issue his first statement on the crash in response to a random Twitter user questioning the evidence presented in a Wall Street Journal story.
"Your research as a private individual is better than professionals [at WSJ]! Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD," Musk wrote. "Moreover, standard Autopilot would require lane lines to turn on, which this street did not have."
Tesla's eternally ardent defenders online are taking this as a firm denial; we'd note that the words "so far" are doing a lot of heavy lifting, as does the unspoken technicality that he's claiming Autopilot wasn't enabled at the moment of collision. What about a few seconds before? As for the idea that standard Autopilot wouldn't allow itself to be used on the road in question, or wouldn't work without anyone in the driver's seat, it doesn't take much searching to find videos of people easily abusing the system in dangerous ways, or videos of Autopilot being activated on unmarked roads. The safeguards are clearly not enough. Unlike GM's Super Cruise, Autopilot doesn't monitor the driver's presence beyond requiring a fastened seatbelt and a tug at the wheel every 30 seconds, nor is it restricted to physically mapped roads.
That we might only have Tesla's word for the truth in this crash is also why, late Monday afternoon, Harris County authorities announced they'll be serving Tesla with a search warrant on Tuesday to obtain and secure that data, according to Reuters. This week will be interesting.
Got a tip? Send us a note: email@example.com