The National Transportation Safety Board's preliminary report on the fatal crash of an Uber self-driving car in March confirms that the car detected the pedestrian as early as 6 seconds before impact, but did not slow or stop, because its emergency braking systems had been deliberately disabled.
Uber told the NTSB that "emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior" – in other words, to ensure a smooth ride. "The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator." It is not clear why the emergency braking capability exists at all if it is disabled while the car is in operation. The Volvo's built-in safety systems – collision avoidance and emergency braking, among others – are also disabled while in autonomous mode.
It seems that, in an emergency situation like this one, this "self-driving car" was no better – and arguably noticeably worse – than many ordinary cars already on the road.
It is difficult to understand the logic of this decision. An emergency is exactly the situation in which the autonomous car, not the driver, should take action. Its long-range sensors can accurately detect problems at far greater distances, while its 360-degree awareness and path planning allow it to make safe maneuvers that a human could not execute in time. Humans, even with their full attention on the road, are not good at catching these things; relying on them only in the most difficult circumstances, the ones demanding quick reaction times and precise maneuvers, seems an incomprehensible and deeply irresponsible decision.
According to the NTSB report, the vehicle first registered Elaine Herzberg on lidar 6 seconds before the crash – at the speed it was traveling, that puts first detection at about 378 feet away. Over the following seconds she was classified first as an unknown object, then as a vehicle, then as a bicycle (the report does not specify exactly when these classifications occurred).
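As a quick sanity check of the report's figures (my own arithmetic, assuming roughly constant speed, not a calculation from the NTSB document):

```python
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

def implied_speed_mph(distance_ft: float, time_s: float) -> float:
    """Speed, in mph, needed to cover distance_ft in time_s at constant speed."""
    return distance_ft / time_s * SECONDS_PER_HOUR / FEET_PER_MILE

# 378 feet covered in 6 seconds implies roughly 43 mph.
print(round(implied_speed_mph(378, 6)))
# The report's later figure of 80 feet at 1.3 seconds implies roughly 42 mph,
# so the two distances in the report are mutually consistent.
print(round(implied_speed_mph(80, 1.3)))
```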
During those 6 seconds, the driver could and should have been alerted to an anomalous object ahead on the left – whether a deer, a car, or a bicycle, it was entering or could enter the road and demanded attention. But the system did not warn the driver, and apparently had no way to do so.
At 1.3 seconds before impact, about 80 feet away, the Uber system determined that an emergency braking maneuver would be needed to avoid Herzberg. But it did not brake, because the emergency braking system had been disabled, and it did not warn the driver because, again, it could not.
Only then, less than a second before impact, did the driver look up from what she was doing and see Herzberg, whom the car had known about for 5 seconds. It struck and killed her.
It reflects extremely poorly on Uber that it disabled the car's ability to respond in an emergency – even though it was permitted to drive at speed at night – and provided no way for the system to alert the driver when it detected something important. This is not just a safety shortfall, like going out with an underperforming lidar unit or unchecked headlights – it is a failure of judgment by Uber, and one that cost a person her life.
Arizona, where the crash occurred, has barred Uber from further autonomous testing, and Uber yesterday ended its program in the state.
Uber offered the following statement on the report:
Over the course of the last two months, we've worked closely with the NTSB. As their investigation continues, we've initiated our own safety review of our self-driving vehicle program. We've also brought on former NTSB Chairman Christopher Hart to advise us on our overall safety culture, and we look forward to sharing more on the changes we'll make in the coming weeks.