Video Raises More Questions than Answers in Self-Driving Vehicle Death
The news media has been abuzz since a self-driving Uber SUV struck and killed a pedestrian in Tempe, Arizona, last weekend. While it is not the first accident or fatality involving a self-driving vehicle, it is the first to result in the death of a pedestrian.
Two years ago, the driver of a Tesla in Autopilot mode was killed when his vehicle crashed into a semi-truck that turned in front of him. An investigation by the National Highway Traffic Safety Administration (NHTSA) found no defect in the manufacturer's Autopilot system, though the National Transportation Safety Board (NTSB) later cited the system's design limitations as a contributing factor. In an incident earlier this year, a Tesla in Autopilot mode crashed into a parked fire truck in Culver City, California, but fortunately, no one was injured.
Immediately following the tragedy, Uber temporarily suspended its driverless vehicle testing program, and the world now awaits the conclusions of several investigations.
Police Release Dashcam Video, Igniting More Debate
The accident has also rekindled the debate about the safety of autonomous vehicles, a concern that has always surrounded the fledgling technology. A graphic video of the incident released by the police on Wednesday, March 21st, has intensified this debate and raised more questions than it has answered.
Two camera angles are shown. The first looks through the windshield from the inside of the vehicle, essentially what would be the driver’s point of view (or POV). The second records the driverless vehicle’s operator.
From the first POV, we see that the car is traveling down an unlit roadway at roughly 40 mph, according to reports. There are no streetlights where the vehicle is traveling, though we do see some several hundred yards ahead. Suddenly, a figure walking a bike across the roadway appears in the headlights, and only a few frames of video pass before the clip cuts off, just before impact.
The second POV focuses on the operator (test cars must have a human behind the wheel to intervene if needed). The operator appears to be looking down and to the right at the beginning of the clip, then looks up just in time to react with horror at the moment of impact.
From the video, it appears that no human driver would have been able to react in time to avoid the accident, either by braking or by swerving. In fact, the San Francisco Chronicle has quoted Tempe Police Chief Sylvia Moir as saying, “It’s very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how she came from the shadows right into the roadway…. I suspect preliminarily it appears that the Uber would likely not be at fault in this accident, either.”
But experts on autonomous vehicles disagree. Alain Kornhauser, head of the autonomous driving program at Princeton University, said, "What we now need is for the release of the radar and lidar (a sensing technology that uses laser light to measure distances) data. The front-facing video suggests that this person was crossing the lane at a slow speed and should have been noticed by the system in time to at least apply the brakes, if not stop the vehicle completely. While a human may not have been able to avoid this crash, a well-designed, well-working collision avoidance system should have at least begun to apply the brakes."
A law professor and driverless specialist at the University of South Carolina, Bryant Walker Smith, said, “Although this appalling video isn’t the full picture, it strongly suggests a failure by Uber’s automated driving system and a lack of due care by Uber’s driver as well as by the victim.”
Kornhauser added, “Obviously, the video of the driver is extremely bad for Uber and probably implies that Uber should suspend all of its ‘self-driving’ efforts for a while if not for a very long while.”
Since the main argument in the campaign for driverless technology has always been that autonomously operated vehicles would actually be safer than those operated by humans, the defense that a human driver wouldn't have been able to avoid hitting the pedestrian in this case misses the point: the technology is supposed to do better than a human, not merely as well.
How Driverless Liability Works
If the deceased woman's family were to take civil action in response to this incident, several parties could be held liable. Uber, of course, would be the prime defendant: it was an Uber vehicle, using Uber-developed technology, with an Uber-hired operator behind the wheel. That operator, now revealed to have previously served nearly four years in prison for armed robbery and other felonies, was hired by Uber, whose background checks of employees have come into question in the past.
The operator herself, even if she was properly hired and trained, could also share liability if she is found to have acted negligently.
Finally, Arizona could share some liability for the pedestrian's death because the state has aggressively embraced driverless vehicle testing. In 2015, Governor Doug Ducey signed an executive order allowing the testing of driverless vehicles on Arizona's roads with few to no regulations, and the state later welcomed Uber's test fleet after California regulators shut down Uber's driverless vehicle testing program when video footage surfaced showing an Uber prototype running a red light in front of a pedestrian.
If you are ever injured or lose a loved one to a driverless vehicle, it's important to have the counsel of an experienced car accident attorney. Legal issues involving emerging technologies can be very complicated because there is little established legal precedent pertaining to them. But a skilled attorney can research previous judicial decisions and argue that they apply by analogy to your unique situation. Such cases require extensive and time-consuming research, but they can be done.
It appears that self-driving vehicles are bound to be part of Arizona’s future, but we at Breyer Law Offices, P.C., sincerely hope that the powers that be ensure that we reach the future safely. Advances in technology are a good thing, but they shouldn’t come at the risk of injury or death to innocent human beings.