A tragic accident occurred this week when a Volvo XC90, modified by Uber to drive autonomously, struck and killed a middle-aged woman who was slowly walking a bicycle across the street. Despite early reports suggesting the woman appeared "suddenly," she did not. She was already in the roadway as the Uber vehicle approached, crossing from the left travel lane into the right lane, in which the Uber vehicle was traveling. If you have not yet seen it, you can watch dashcam footage of the accident. The actual crash is not shown, but the video gives viewers a clear picture of what took place.
This is exactly the kind of scenario Volvos are designed to foresee and prevent. Volvo's XC90 has technology that can detect pedestrians, bicyclists, and even non-human shapes such as moose. As Volvo's video below shows, accidents like this are now preventable not only by an attentive driver (and the backup driver inside the Uber vehicle certainly appears not to have been attentive), but also by active safety systems with automatic braking.
These systems go by names such as City Safety and Pedestrian Detection, and they are available not just in pricey Volvos but in mainstream cars like affordable Mazdas.
Uber is going to have a very hard time explaining why its self-driving vehicle ran over a slow-moving pedestrian when unmodified Volvos are designed not to.
Comments
Anyone assuming these systems are designed to avoid accidents and are 100% capable had better open their eyes. Driving is not a predictable and repeatable event. All of these systems have a very long way to go.
Governments love autonomous driving. They can finally track your every move, tax you on miles driven, and have total control with outdated speed limits. Be careful what you wish for.
Both Volvo and Uber are directly responsible for this death.
Society today has been conditioned to bow at the altar of computers. After all, computers never make mistakes, never crash, never get hacked, and are perfect in every way. Computers, like automobiles, are made up of independent parts that, as a whole, make up the final product. If one of those parts is defective in design or workmanship, the product is recalled. The Takata air bag recall comes to mind as an example of a defective product. Occasionally automakers recall vehicles to update their software when programming issues are discovered.
So when a pedestrian is killed by a self-driving car, well, it can't be the car's fault... can it? My feeling is that there was a rush to vindicate the car from the start of the investigation, evidence be damned! We just can't have a chink in the armor of self-driving technology.
Technology is a wonderful thing. But it is not infallible. It's tragic that someone had to die to make this point.