At the annual Automated Vehicles Symposium on Tuesday, speakers in San Francisco emphasized that the decisions of human beings—as regulators, executives and consumers—are going to determine how bright, or how dark, the future of self-driving cars will be.
If regulators make too many laws now, warned the head of policy at X (formerly Google X), “the cold, dry text” will stymie the most cutting-edge technology. If business leaders aren’t uber-transparent about why they believe their self-driving vehicles are safe, cautioned a law professor, the crucial element of public opinion won’t be on their side. And if we’re not realistic about human habits, predicted a Berkeley transportation expert, automated vehicles could lead us to waste more gas and cause more congestion rather than ushering in a traffic- and accident-free utopia on the road.
That Berkeley expert, professor Joan Walker, highlighted the example of “zero-occupancy” vehicles that could clog residential streets as people summon them for on-demand home deliveries. One of the biggest problems in America’s transportation system is that the vast majority of trips are taken by people driving an otherwise empty car, one that sits dormant in their driveway 95% of the time. Many experts have touted the onset of self-driving cars as a chance to rethink that relationship and share a smaller pool of vehicles, easing traffic and lessening the need for parking spots. But Walker argued that people might also call for a self-driving car with all the environmental consideration they give to ordering their fifth package of the week on Amazon Prime.
“To get to the future reality that we want will take not only automation,” said Walker, “but also requires behavior change.”
Conversations about self-driving cars have a tendency to drift into scintillating hypotheticals, whether someone is imagining just how many lives could be saved on the road if we took human error out of the equation or mulling the ethics of an autonomous sedan deciding whether to save the life of its rider or of several pedestrians in its path, a version of the classic trolley problem. But the speeches at the conference on Tuesday, which bills itself as “the largest gathering … of professionals involved with making automated vehicles a reality,” often concentrated on more mundane, more immediate questions.
Is it ethical to beta test self-driving cars on the road when most of the people using that road haven’t given their consent? How much advertising will be allowed in a self-driving car or along the route that car is programmed to take? As more cars get semi-autonomous features—like self-parking or self-braking—who is going to take responsibility for explaining to drivers how they work? In the coming years, as those features become more robust, how do we train humans to stay engaged instead of literally going to sleep at the wheel?
Many speakers pointed to the past and present to make arguments about how humans should be approaching the onset of self-driving cars, but with different aims. X’s head of policy, Sarah Hunter, argued that laws have tended to come after innovation in transportation, suggesting that too many lawmakers have been authoring statutes rather than offering up their states as “test beds.” Others pointed to our current relationship with private vehicles, with all its inefficiency, as a tale of how things could go wrong or lead to disparities in society. And others, particularly J.D. Power and Associates’ Kristin Kolodge, made the case that not enough attention is being paid to the building blocks of autonomous cars that are already on the road.
Kolodge presented research showing that drivers have resorted to “trial and error” in trying to figure out how to work features like adaptive cruise control (an increasingly common feature that uses sensors to keep the car at a set distance from the one in front of it, rather than at a set speed). Most of the people surveyed by J.D. Power said the dealer staff didn’t show them how such features worked. About half said their blind-spot monitors malfunctioned sometimes. Kolodge warned that the comfort and trust people have with these “assist” features will help determine how and when fully automated cars can “achieve serious volume.”
Trust may have been top of mind for speakers in part because of the recent death of a Tesla driver; the role that autonomous technology may have played in the accident is under investigation by the federal government. Multiple speakers mentioned the incident, including Kolodge, as they emphasized how important the attitudes of human beings will be as new technology continues to roll out. “It’s extremely unfortunate the Tesla incident that happened,” she said. “This element of trust is extremely fragile.”
Among those who came to address the audience with a message of prudence was U.S. Transportation Secretary Anthony Foxx. “We don’t want to replace crashes with human factors with large numbers of crashes caused by systems,” Foxx said. But “ready or not,” he added, “autonomous vehicles are coming.”