By Shelia Dunn, NMA Communications Director
For years, a debate has raged in the media about who would be responsible when a driverless vehicle injures or kills another road user. Perhaps we will soon have the answer.
In March 2018, Elaine Herzberg jaywalked with her bicycle from a center median across a dark street in Tempe, Arizona, and was struck by a driverless car. The tragic accident was the first US death of another road user involving an autonomous vehicle.
Dashboard cameras recorded both the street and the car's interior at the time of the accident. The safety driver on board, Rafaela Vasquez, was distracted and unable to avert the crash at the last moment. In autonomous mode, the Uber was traveling 38 mph in a 35 mph zone and did not attempt to brake.
The Tempe police chief stated at the time that the autonomous Uber was not at fault in Herzberg’s death. Chief Sylvia Moir told the San Francisco Chronicle after reviewing the video, “It’s very clear it would have been difficult to avoid this collision in any kind of mode based on how she (Herzberg) came from the shadows right into the highway.”
Local police, the National Transportation Safety Board (NTSB), and the National Highway Traffic Safety Administration (NHTSA) all investigated the accident. The state of Arizona immediately suspended Uber's permission to continue testing driverless vehicles in the state. Uber responded by pulling its related programs from other states until a partial reinstatement several months later. Soon after the accident, Uber also settled with the Herzberg family for an undisclosed amount.
In March 2019, Yavapai County Judge Sheila Sullivan Polk declared that Uber had no criminal liability in the death of Elaine Herzberg.
In November of that same year, the NTSB concluded its investigation with a 400-page report stating that neither Uber, the state of Arizona, nor the car's operator had been vigilant. The report also concluded that Uber's driverless car technology reflected a cascade of poor design decisions that led the Volvo XC90 SUV to improperly process and respond to Herzberg's presence as she crossed the roadway. The vehicle's radar did not detect Herzberg until about six seconds before impact, and the system could not classify an object as a pedestrian walking a bicycle unless it was near a crosswalk. Herzberg was 100 feet from the nearest crosswalk.
The Volvo’s automatic braking system had also been disabled because its electronics interfered with Uber’s self-driving sensors.
The NTSB report also criticized NHTSA, the federal road safety agency, for not regulating driverless car tests on public roads. One NTSB board member stated, “There’s no requirement. There’s no evaluation. There are no real standards issued.”
NHTSA has avoided issuing autonomous driving regulations, instead offering voluntary guidelines, including safety assessment reports. As of this week, more than two and a half years later, NHTSA has still not released its own report on the accident.
Fast forward to September 2020: a Maricopa County grand jury charged Uber safety driver Rafaela Vasquez with negligent homicide in the death of Elaine Herzberg. Her employer, Uber, the company that built the automated system involved in the fatal crash, will not be charged.
Before the accident, Vasquez, a backup driver expected to intervene when the autonomous driving system required it, was watching her phone instead of the road. Ryan Calo, a University of Washington School of Law professor who studies robotics, told Wired magazine in a recent interview, “That’s a simple story, that her negligence was the cause of (Herzberg’s) death. Bring a case against the company, and you have to tell a more complicated story about how driverless cars work and what Uber did wrong.”
Even if Vasquez had been watching the road, could she have reacted in time? Decades of research have shown that humans find it difficult to keep their attention focused on partially automated tasks. That is why, until late 2017, Uber used two safety drivers per vehicle to counter this “automation complacency” problem. The company, however, switched to a single safety driver a few months before the tragic accident, presumably to save money.
Also, weeks before Herzberg was killed, an Uber whistleblower named Robbie Miller warned company executives in an email about safety issues. He told top officials, “A car was damaged nearly every other day in February (2018). We shouldn’t be hitting things every 15,000 miles.” He also added, “Several of the drivers appear not to have been properly vetted or trained.”
As we slowly move toward autonomy Levels 3, 4, and 5 (the last requiring no human intervention at all), the question of responsibility will become even more acute.
The trial of former Uber safety driver Rafaela Vasquez, the least deep-pocketed target on the liability list, begins soon.