By Mark Geistfeld
March 30, 2018
Geistfeld is Sheila Lubetsky Birbaum Professor of Civil Litigation at New York University School of Law.

There is a paradox central to the ongoing ascension of autonomous vehicles. AVs — also called driverless cars — may one day make the world a safer place: by eliminating the human driver and abiding by safe operational standards, these machines can substantially reduce the number of vehicular crashes, which caused an estimated 40,000 traffic deaths domestically in 2017. But in order to reach this level of performance, driverless cars will first kill some of us. In fact, to some degree, they must be allowed to do so — ideally, within limits.

Such was the intent of a bill passed by the House of Representatives last fall, which established a regulatory framework for facilitating the safe development of these commercially promising technologies. But those objectives are now being subverted by legislation pending in the Senate. The new bill, titled the American Vision for Safer Transportation Through Advancement of Revolutionary Technologies Act (AV START Act), eliminates all legal rights that individuals would otherwise have against driverless-car manufacturers and commercial operators during the period when AVs are being tested but are not yet governed by federal motor vehicle safety standards, a period likely to last at least the next three to five years. During that time, the Senate bill would effectively turn people using public roads into guinea pigs.

Like experimental drugs and medical treatments, driverless cars require extensive testing that can cause bodily injury and premature death. To develop the programming that enables these vehicles to safely interact with other vehicles and pedestrians, AV testers require other drivers to be on the street so that the vehicles can learn how to avoid crashes. But they will not always succeed. Driverless cars will crash, and people will die. In effect, the manufacturers and commercial operators of these vehicles must experiment with us in order to learn how to program AVs.

Our legal system recognizes that individuals can be used for experimental purposes under certain conditions. Some individuals, for example, take experimental drugs because the potential therapeutic benefit is the best if not the only hope. Others receive money for agreeing to participate in such an experiment. When these types of consensual exchange are made on a sufficiently informed basis, the individual is a willing participant in the scientific test (unlike a guinea pig, who has no recourse against the experimenter).

Driverless cars have so far been involved in two known fatal crashes, which together illustrate an essential difference in who assumes the risk when using this technology.

The first occurred in May of 2016 on a Florida highway. While Tesla enthusiast Joshua Brown was operating his Model S in its “Autopilot” mode, he fatally collided with a semitrailer truck crossing the road. He reportedly was watching a movie at the time of the crash, and the car apparently had been warning Brown to disengage the Autopilot mode and take over active driving responsibilities moments before he crashed. Like individuals who agree to face experimental medical risks in exchange for a potential therapeutic benefit, Brown chose to engage the Autopilot mode in exchange for the rewards and the excitement afforded to him by this experimental technology. As long as he made that choice on a sufficiently informed basis, he was not an unwilling guinea pig forced to help Tesla develop its Autopilot mode. (The incident is also unrelated to the recent Model S recall.)

This was not the case on March 18, when a driverless car operated by Uber in Arizona killed Elaine Herzberg because it failed to detect that she was crossing the road outside of the pedestrian crosswalk. Unlike Brown, Herzberg did not consent to participate in the AV testing. If Uber’s vehicle was defective or its testing program was unreasonably dangerous in any respect that caused the crash, Uber is legally responsible for her wrongful death. This type of tort liability protects individuals from harms caused by nonconsensual risky interactions. Herzberg — and now, tragically, her family — has a right of recourse against Uber, and the parties have just reached a settlement of those claims for an undisclosed amount.

In sharp contrast, the pending Senate bill would eliminate all forms of tort and related civil liability for physical harm caused by the testing of driverless cars, at least until a new regulatory framework is in place. And that is not likely to happen for at least three to five years.

Absent federal intervention, the states can apply their own laws — though Congress prefers uniform regulation nationwide. After watching a video of the Uber AV running over Herzberg, Arizona Governor Doug Ducey ordered Uber to suspend its testing operations in the state; the company says it has done so in all cities. (The crash has also caused some Senators to question whether the pending bill goes too far.) Regardless, even if the government must limit manufacturers’ liability during this phase in order to foster the development of this life-saving technology, we do not have to be forced to participate in this experiment without any right of compensation for injury.

Congress addressed a similar set of policy issues in The National Childhood Vaccine Injury Act of 1986. Congress heard testimony that childhood vaccinations are effectively obligatory, and so those who are injured as a result of complying with this obligation deserve compensation. Manufacturers, in turn, complained that litigation expenses and premiums for liability insurance were dwarfing revenues and disrupting the commercial supply of vaccines. As then–First Circuit Judge Stephen Breyer explained, “The Vaccine Act responds to these complaints.” It guarantees scheduled amounts of compensation for vaccine injuries, funded by a tax imposed on vaccines. In exchange for such compensation, vaccine victims waive their tort rights. The streamlined process reduces litigation costs and insurance premiums for manufacturers.

Commuting in a world with autonomous vehicles on the road is as worthy a goal as vaccinating children, and it is just as inescapable. (On March 27, Alphabet’s Waymo announced plans to put thousands of driverless cars on public roads in the next two years.) Congress should therefore require AV testers to compensate the individuals who are injured by their forced participation in AV testing, at least to some degree. We have solved this problem before. We can encourage the development of a life-saving technology without devaluing the wellbeing of the people risking their lives to help make that better world a reality.
