How PAVE Can Make A Difference in Autonomous Car Education
The new safety group coalition needs to step up to some tough challenges.
Some of the biggest mobility news at this year's (boring) Consumer Electronics Show was the launch of Partners for Automated Vehicle Education, a coalition of autonomous vehicle industry players and nonprofits aimed at improving the public's understanding of automated vehicles. Some two years in the works, the formation of PAVE by some of the space's heaviest hitters shows that the era of self-driving hype may finally be coming to a close. But, as Jack Stilgoe points out at DriverlessFutures (and the gadfly vanguard of #AVTwitter has been discussing in the days since the announcement), PAVE can't simply show up, talk about the safety benefits of AVs, and then hope to make the impact they are looking for. Understanding the source of the confusion and misinformation that they hope to address, and finding the initiative to directly confront it, will be very real challenges.
Stilgoe argues that there's a potential parallel to PAVE: "In the 1980s," he writes,
"scientists and innovators recognised a growing public scepticism. Their prescription was one of education. In the UK, we saw institutionalised programmes to improve the ‘public understanding of science’. These programmes were based on what Brian Wynne later called the ‘Deficit Model’. The problem was seen as one of public ignorance, which could be corrected with more public information. It was the wrong diagnosis. As controversies grew around new technologies like genetically modified crops, it quickly became clear that the problem was not a deficient public, but institutions of science and innovation that didn’t really understand what the public thought. Members of the public had a range of legitimate questions about the technology and these questions weren’t being listened to. The things that scientists and the biotech industry had decided were ‘the facts’ about GM crops were not the only facts that members of the public were interested in."
Stilgoe is right, in that a number of participants at PAVE's CES press conference did seem more interested in talking up their companies or the positive potential of AVs than in understanding or explaining the root of the "confusion" that the organization was set up to address. If PAVE never evolves past that point, it will quickly become irrelevant, another "big tent" organization that disappears into a soft mist of unopposable but irrelevant platitudes. The problem, in a nutshell, is not that people don't understand how AVs may someday benefit society, but that too much focus on long-term benefits has created profound confusion about what automated vehicles are and are not right now. As Stilgoe correctly concludes, "if I were PAVE, I would be looking not to educate but to listen. There is an urgent need for dialogue."
The good news is that PAVE is listening. At least they listened to me and several other members of the media covering automated vehicles at a breakfast they put on the day after the press conference announcing the coalition's launch. After sharing some insight into their goals, structure, and strategy, they encouraged me and other members of the media to share our thoughts about how to address public misperceptions of AVs and work toward a more fact-based conversation about the technology. Attendees shared a number of valuable perspectives and insights, and the PAVE members who were present genuinely seemed to take them to heart. Given that both the media and the AV industry have played a role in the spread of misinformation about autonomous drive technology, the exchange felt almost like a group therapy session.
One of the points that I shared during that breakfast meeting was fairly simple: the last few years have seen a number of "teachable moments," from the fatal Uber and Tesla Autopilot crashes to misleading public statements by companies and even the Secretary of Transportation, that have failed to produce clear lessons for the public. If PAVE wants to effectively tackle the problem, I argued, it needs to be able to recognize these moments and provide clear and authoritative facts about them as swiftly as possible. The fact that no authoritative body publicly corrected the Secretary of Transportation when she said that Level 2 automated vehicles have "no need for any person to be seated and controlling any of the instruments" is a big part of the problem. The fact that very few people understand how Tesla's design decisions around driver monitoring and operational design domain contributed to the fatal Josh Brown crash in Florida is a big part of the problem. My concern, I said, is that broad coalitions like PAVE often struggle to promptly build consensus and speak decisively about dangerous decisions and statements made by companies in competition with their members and by regulators who oversee their members.
As it turned out, the very next morning provided an opportunity to test that concern. Over my morning coffee, I opened Twitter and found that Electrek had written a widely-shared story reporting that Tesla salespeople are warning customers that the "legal aspect" of the company's "full self-driving" option "is very far away." Tesla CEO Elon Musk tweeted later in the day that a driverless "follow" feature was "going through final validation & regulatory approval," and that it was "getting some regulatory pushback. May not be available in all regions." Now, Tesla is the most controversy-laden company in the automotive and mobility technology spaces, and offering its "full self-driving" option may well be the most controversial thing it's ever done, but this specific story is built on a simple misunderstanding that is not in any way controversial: there is no regulatory approval process in the United States for a system like Tesla's "full self-driving" option. By blaming potential delays on a regulatory approval process that doesn't exist, Tesla is actively harming the public's understanding of automated drive systems and the regulatory framework (or lack thereof) around them.
Before I finished my first cup of coffee, I emailed the link to a PAVE contact and pointed out that this was exactly the kind of "teachable moment" the organization should be jumping on. Would PAVE be able to provide me with a brief statement, I asked, not commenting on Tesla specifically but simply laying out what (if any) regulatory approval process exists for automated drive systems on vehicles that are already compliant with Federal Motor Vehicle Safety Standards. Just a simple, factual statement about the state of federal regulation of automated drive systems that I could then apply to what I saw as the confusion that Tesla was contributing to. Easy, right?
After a prompt response promising that my request would be brought up to the appropriate people at PAVE, I heard... nothing. A week later, I still don't have any kind of statement from the organization. I say this not to bash PAVE more broadly, let alone my contact there, as I have no idea what has held up the request. There are all kinds of reasonable explanations, including that the organization is simply recovering from its big coming-out party. But the fact remains that this was precisely the concern that I voiced to the organization at our breakfast meeting, and in the week since, my concern has (rightly or wrongly) only grown.
There is a happy ending to this story. A week after my request to PAVE, I reached out to my friend Amitai Bin-Nun, the Vice President for autonomous vehicles and mobility innovation at Securing America's Future Energy, which is one of the member organizations that make up PAVE. He quickly and thoroughly confirmed what I had understood to be true: there is no formal federal regulatory approval process for a software-only autonomous drive system on an FMVSS-approved vehicle. Here is his full quote:
"The federal government has not yet updated its safety standards for autonomous vehicles. While this makes it difficult to deploy autonomous vehicles with unconventional designs without a steering wheel or brake pedals, official policy guidance has repeatedly clarified that federal regulations do not pose a legal barrier to the development, testing, sale or use of autonomous vehicles with conventional designs. Various states have imposed their own requirements on autonomous vehicles, but as long as they are met, deployment is permissible in the vast majority of the country."
In other words, the idea that "regulatory approval" or "regulatory pushback" could delay Tesla's deployment of "full self-driving" software in the United States is absurd (though in fairness, the Electrek story quotes a Tesla sales advisor as saying the regulatory issues exist "especially in Europe, the USA might be closer to get it legalized"). There simply is no such process at the federal level. Note that Bin-Nun didn't have to mention Tesla once, or editorialize about Tesla's "full self-driving" option, in order to clarify a very basic but important point about automated vehicle regulation in the U.S. His statement proves that it is possible for organizations to capitalize on "teachable moments" like this one and turn misinformation into an educational opportunity.
I air all this publicly not to attack PAVE or suggest that it can't be useful, but because I believe it has the potential to do so much necessary good. PAVE and its members have immense authority on these issues and can get the attention of far bigger media outlets and audiences than the one you are reading now, but only if they have the resolve and the resources to jump on these "teachable moments." Capitalizing on issues of the moment isn't easy, especially for large coalitions, but it's only going to become more necessary as far more important moments than this one arise in the future. PAVE has committed itself to a challenge that is bigger and more complex than it can seem from the outside and they're just getting started, so this isn't about blaming them for anything. This is about making sure that an informational vacuum in our public discourse about an important and challenging topic is filled by a group that is committed solely to factual education, not companies with one eye on their stock price.
Knowing some of the people behind and inside PAVE, I have every confidence that they will get there. If nothing else, Amitai Bin-Nun and SAFE are already showing the way. As for myself, I will keep doing everything I can to help them live up to the enormous challenge they have taken on.