Who killed Elaine Herzberg? Not the driver of the car that ran her over — because there was no driver. And therein lies a problem.
Arizona prosecutors’ decision not to criminally charge Uber for the March death of Herzberg signals that tech companies won’t be punished for taking egregious risks with their untested technology, even when the worst happens. Herzberg, a homeless woman in Tempe, became the first pedestrian killed by a self-driving car.
Uber has already settled a civil case with Herzberg’s family, and the National Transportation Safety Board has yet to release the full findings of its ongoing investigation. Local authorities say the car’s “backup driver” may still be charged with vehicular manslaughter: she was watching “The Voice” on her phone when the car hit Herzberg, and she did not hit the brakes until after the collision.
But Uber isn’t blameless. A preliminary NTSB report found that Uber had deactivated the car’s emergency braking system, and that decision comes down to money. Self-driving cars can be programmed to brake whenever the computer system encounters an object it can’t identify, which in tech jargon is called an “edge case.” But programming the car that way can make the ride jerky and nauseating. Uber was rushing to launch its self-driving taxi service that summer, so it had programmed the car to take chances.