Software in Self-Driving Uber Didn’t Recognize Jaywalkers
An Uber self-driving car that struck and killed a woman last year in Arizona failed to recognize her as a pedestrian because she was jaywalking, US transport regulators said Tuesday.
The woman had been crossing the street “at a location without a crosswalk; the system design did not include consideration for jaywalking pedestrians,” the US National Transportation Safety Board (NTSB) said in a statement.
In a preliminary report, the NTSB had already determined that the car’s software spotted the 49-year-old woman nearly six seconds before the vehicle hit her, as she walked across the street at night with her bicycle in Tempe, a suburb of Phoenix.
According to the latest report, which was issued ahead of a November 19 hearing to officially determine the accident’s cause, the system at no time “classified her as a pedestrian” but instead considered her an object.
When the software determined that a collision was imminent approximately 1.2 seconds before impact, it suppressed any “extreme braking or steering actions” to reduce the potential for erratic vehicle behavior.
It did, however, produce “an auditory alert to the vehicle operator” as it initiated a plan for the vehicle to slow down.
Following the March 2018 accident, Uber suspended its autonomous driving testing in all locations in the United States but resumed the program several months later.
The company has assured the NTSB that new technology in the cars will correctly recognize pedestrians in similar situations and trigger braking more than four seconds before impact.
According to the report, 37 crashes involving Uber automated test vehicles operating in autonomous mode occurred between September 2016 and March 2018, excluding the Arizona crash.