My Infant Daughter’s Life Shouldn’t Be a Variable In Tesla Autopilot’s Public Beta

Neither she nor you signed Tesla's Autopilot Terms and Conditions.

I spotted the crimson Tesla Model 3 immediately as it began merging onto the 101 highway; years of riding motorcycles in Los Angeles have made my peripheral vision razor-sharp. With my infant daughter sitting in the back, singing along with my wife and brother to some kid tune, time slowed as the Model 3 failed to heed our presence in the right lane and aimed straight for our passenger doors. With inches to spare, I swerved into the unoccupied left lane and narrowly avoided an accident. In that same split second, I saw the Model 3 driver’s hands jump from his lap to the wheel and yank it to the right. The car was on Autopilot.

It’s time to regulate this technology.  

Tesla CEO Elon Musk has scoffed whenever the intrinsic danger of using the public as a mass beta test is brought up. The company’s communications team states that Tesla drivers have safely logged more than 1 billion miles using Autopilot. That preponderance of data supposedly demonstrates that the advanced driver-assistance system (ADAS) is perfectly safe for consumer use—despite an increasing number of Autopilot-related crashes of late. Yet even if we take Tesla at its word, that doesn’t change the fact that we’re being used to test the company’s software and hardware for limitations and bugs.

[Image: Jonathon Klein]

My daughter, your daughter, your son, your wife, your husband, your brother and sister, your father and mother: every single person who shares the road with an Autopilot-equipped car is, in essence, one of Tesla’s lab rats. What’s a few deaths when you’re advancing technological progress?

Tesla covers its ass by giving those “futurists” willing to use Autopilot—again, not a fully autonomous vehicle—a Terms and Conditions prompt before they can engage the system. The dialogue box informs drivers that they must agree to “Keep your hands on the steering wheel at all times and to always maintain control and responsibility for your vehicle.” Yet, unlike the Terms and Conditions we accept on a regular basis—the ones almost no one ever reads—this agreement’s effects reach beyond the user. There are other people on the road, like my daughter, who have given no agreement, tacit or otherwise, to be beta testers. No amount of Tesla legalese can refute that.

Reality hasn’t yet fully tested the company: to date, the fatal crashes associated with Autopilot have killed only the vehicles’ occupants, drivers who had accepted Tesla’s liability waiver through those prompts. Yet Musk’s public comments and misinformation—along with an increasing number of Teslas on the road—will likely see the number of accidents rise as ordinary consumers misinterpret Autopilot’s capabilities and mistake it for a fully autonomous system. Full autonomy, by the way, is something Musk has repeatedly promised is coming within the year.

Last year, The Drive’s Alex Roy predicted the rise of the very type of incident I encountered, saying, “The more [automated] systems substitute for human input, the more human skills erode, and the more frequently a ‘failure’ and/or crash is attributed to the technology rather than human ignorance of it. Combine the toxic marriage of human ignorance and skill degradation with an increasing number of such systems on the road, and the number of crashes caused by this interplay is likely to remain constant—or even rise—even if their crash rate declines.”

[Image: Tesla]

Roy ended his editorial with an ultimatum: either automated systems like Tesla’s Autopilot are regulated, “Or we can do nothing and suffer through the same clickbait and hand-wringing over and over until the next crash.” As of this writing, regulatory oversight of automated systems is largely left to those building and testing these technologies. And any sort of regulation is viewed by the decision-makers as limiting technological progress.

Elaine Chao, the United States’ Secretary of Transportation, sees these laws as a burden, telling a group of reporters last year that her office is working with autonomous tech companies to target regulations that are hindering progress. In some states, governing agencies have decreased regulation and increased incentives so as to attract members of this nascent and well-funded industry.

As for the manufacturers themselves, Ford, Toyota, and General Motors have recently joined forces with SAE International to create a set of standards that’d give the public something more concrete with which to measure automated systems and their respective success and implementation. However potentially beneficial, this push—as well as the efforts above—is still a matter of the inmates running the asylum, as each of these organizations has a vested or financial interest in proving to an increasingly wary public that the technology is safe and should be purchased. What we need are people who understand the technology’s current limitations, capabilities, and potential, and who can craft common-sense regulations that make the public’s safety goal number one. We’re not even close to anything of that sort.

[Image: Tesla]

“Autonomous” has become a buzzword corporations use to bolster profits, egos, and little else. Its use tells consumers that a manufacturer is looking to the future and aims to get people around more easily and safely than ever before. It spins a tale of a tomorrow devoid of drivers and personal liability: a utopia. It’s a vision that’s since been parroted by everyone from talking heads to reporters to futurists with larger-than-average social media followings to Silicon Valley’s “best and brightest.” What they’re really selling is the snake oil of our time. Whenever these automated systems have been brought under scientific scrutiny, they’ve wilted in the light.

The quest for progress has allowed companies like Tesla to endanger the lives of everyone on the road. We’ve given these companies space to hide behind the voluminous Terms and Conditions contracts that we’re programmed to accept automatically. It has to stop. Regulation must be mandated. These technologies need geofencing, parallel autonomy, driver-monitoring systems, rigorous virtual and real-world testing away from the public, and the same sort of laws that govern the rest of the automotive world.

It’s time to rein in these automated systems and pull them from the anarchy of regulation-free public use. If we don’t, my daughter—as well as all of your children—could become just another anomaly on the path toward progress.