Deadly crash raises legal questions about Uber’s autonomous vehicle autopilot systems and products liability.
A self-driving Uber car struck and killed a 49-year-old woman in Tempe, Arizona last week; this was the first reported fatal crash involving an autonomous automobile and a pedestrian in the US.
An inattentive Uber safety driver placed her Volvo XC90 SUV on autopilot just before the victim walked her bicycle across a dark public highway. Though armed with lasers and radar designed to identify and avoid pedestrians, the Uber’s navigation system failed to react in time to prevent the fatality.
The company abruptly halted testing of its autonomous vehicles across North America after the fatality, and Tempe police are investigating to determine who is at fault.
For almost a year now, Marc D. Grossman of Sanders Phillips Grossman has been reporting about the foreseeable harms autonomous automobiles may cause individuals and the legal consequences that should follow.
Our attorneys have studied the legal issues surrounding driverless car technology from every angle and have opened discussions that have touched on negligent personal injury considerations to more complex products liability questions surrounding unmanned automobile technology.
The devastating fatality in Arizona last week reinforces a legal question that our lawyers have presented for months: who’s responsible when autonomous vehicles kill or harm pedestrians?
Sanders Phillips Grossman now feels compelled to study this accident in detail and abstract the legal issues surrounding the self-driving Uber fatality in Tempe, Arizona.
Uber Driver
Safety driver Rafaela Vasquez was behind the wheel of an Uber self-driving Volvo when the fatal accident took place. Vasquez was not of diminished capacity at the time and was responsible for taking over if the Uber’s autonomous systems failed.
Uber is further liable for Vasquez under the doctrine of respondeat superior because the company hired her as a safety driver for testing unmanned cars in Arizona and the accident occurred within the scope of her employment.
Vasquez placed the Uber car in self-driving mode to travel the legal speed limit, and video shows Vasquez was looking down at the vehicle’s autopilot panel when the Uber struck the pedestrian.
This evidence indicates that although the human backup driver had the capacity to take control of the vehicle, reliance on the self-driving vehicle’s autopilot system left Vasquez inattentive in the second or two before impact.
Preliminary evidence suggests the collision was “unavoidable” and that Vasquez “would likely not be at fault in this accident,” according to Tempe Police Chief Sylvia Moir.
On a side note, it appears Vasquez was a convicted felon before starting work at Uber, which may raise questions later about Uber’s pre-employment driver screening.
Accident details
At the time of impact, Uber’s vehicle was traveling in autonomous mode at 38 mph on a rural highway after midnight. Google Maps shows the speed limit at the accident site to be 45 mph; yet Tempe Police claim the speed limit in the area was 35 mph.
Video shows forty-nine-year-old Elaine Herzberg appearing suddenly from the shadows in a dark and poorly lit area at Mill Avenue and Curry Road, crossing the streets outside of marked crosswalks.
All vehicle systems were engaged and functioning, and it is not yet clear whether the autopilot hardware failed to detect Herzberg or recognized the victim’s presence but failed to react.
The Uber Volvo never slowed down and the safety driver could not take control of the vehicle before striking Herzberg, who was thrown to the side of the road and later died from her injuries at the hospital.
Uber Autonomous Autopilot Hardware and Software
Self-driving automobile manufacturers expressly warrant that an autonomous vehicle’s cameras, radars, and light detection-ranging sensors (LIDAR) can see beyond what normal human drivers can see.
This enhanced perception supposedly implies that unmanned vehicle autopilot operation is safer than traditional human driving. However, independent market researchers have found that the driverless autopilot hardware installed in most autonomous automobiles is inexpensive.
Whether automakers may install cheap hardware on unmanned vehicles entrusted with protecting human lives is a legal question the courts may soon clarify for us.
Uber’s Volvo XC90 SUV test vehicle employed ultrasonic sensors and LIDAR to survey its surroundings and measure the proximity of objects close to the vehicle.
Video cameras and ultrasonic radar sensors were supposed to detect Herzberg’s presence and deliver a signal to the XC90’s central computer to steer clear of or brake for the pedestrian.
Beyond usual crosswalk positions, the SUV’s GPS and odometry sensors should also have mapped routes that Arizona pedestrians and cyclists are prone to use and cross at night.
The Volvo XC90’s LIDAR uses sparse beams for monitoring pedestrian movement.
The cheaper technology may bring sticker costs way down, but at what cost?
Would an alternative design like Google’s 64-beam LIDAR autopilot system have saved Herzberg’s life?
Whether Herzberg would be alive today had Uber installed more expensive or alternative hardware systems may be a question for fact finders to resolve in court. But Sanders Phillips Grossman can most certainly argue that the electronic devices on Uber’s XC90, vital for averting pedestrian harm, completely failed to do their job in Tempe last Sunday morning.
Equipment Failure
Herzberg walked her bicycle slowly across the open Arizona road; she did not jump in front of the Uber, nor were there building structures around to interfere with the LIDAR and radar detection equipment.
Yes, it was dark and the area of the accident was poorly lit like most urban streets are at midnight; but, Sanders Phillips Grossman asserts automakers have legal duties to design autonomous autopilot detection equipment to cut through severe weather and dark road conditions and find innocent pedestrians in harm’s way.
Self-driving hardware should also efficiently interact with navigation software and take forward-looking aversive action for protecting human life; or at least, autonomous automobiles should advise pedestrians of their oncoming presence via sirens, flashing lights or high beams as soon as they detect humans in their path at night.
Bryant Walker Smith, a legal authority on autonomous vehicles, certainly agrees with Sanders Phillips Grossman’s legal duty assertions, telling reporters on Wednesday that the self-driving Uber “absolutely should have been able to pick her up” and that video evidence shows “the car is at fault, not the pedestrian.”
Autonomous car researcher Michael G. Bennett further adds that self-driving vehicle technology must detect pedestrians and avoid hitting them regardless of where they cross streets, because “Every day, pedestrians in cities around the world step outside of the crosswalk.”
Inattentive vs. Attentive Drivers
Before considering who may be at fault for Sunday’s tragic accident, it is important to briefly examine the distinction between inattentive and attentive drivers of autonomous vehicles.
Attentive drivers are individuals who have the power to intervene when self-driving navigation systems fail. These operators may share fault when they foresee accidents approaching but fail to exercise ordinary care to escape collisions.
Autopilot misuse occurs when such drivers see danger approaching yet affirmatively trust the vehicle’s navigation system to steer clear of it.
Inattentive drivers engage their vehicle’s autopilot and attend to minor tasks while on the road. These operators trust self-driving warranties from automakers and their dealers who affirm autopilots can handle driving while performing foreseeable tasks on the road (making phone calls or reviewing email).
Inattentive drivers contrarily do not misuse self-driving systems if they temporarily take their eyes off the road while their cars run on autopilot, since this use is foreseeable.
Determining Fault
Tempe Police Chief Moir affirmed on Tuesday that authorities have yet to determine who was at fault for the Uber pedestrian death. Yet Moir is on record suggesting the accident may have been “unavoidable” and that the tragedy would have occurred even if the operator had been human.
What we saw from the video footage is that Vasquez engaged the Uber’s autopilot for a night-driving test and took her eyes off the road for a few seconds to monitor what looked like information on the autonomous car’s dashboard.
By definition, we can then classify Vasquez as an inattentive driver who relied on the car’s autonomous software and hardware to operate the Uber and avoid pedestrian harm while she took test readings on the road.
In previous analyses, Sanders Phillips Grossman has examined autonomous vehicle fault in considerable detail from the standpoint of: attentive drivers, inattentive drivers, diminished capacity drivers and disabled drivers.
To make things simple here, we’ll examine fault liability in Uber, Vasquez and Herzberg’s actions leading to the accident.
Uber Safety Driver Liability
The Uber was operating under high automation, handling all aspects of the dynamic driving task at the time of the accident. So the law should assign no fault to Vasquez unless she intentionally disrupted the autopilot system.
Vasquez gazed at something inside the car about four or five seconds before impact, and cameras did not record her hand movements.
If evidence confirms the safety driver never obstructed or altered proper operation of the vehicle’s autopilot network, attorneys cannot claim Vasquez misused the technology or was comparatively negligent in causing damages.
In that event, complete liability should shift away from Vasquez and onto Uber, the automaker, and their hardware and software providers.
Uber Liability
It is possible the automated sensors and radar detected Herzberg in the distance and elected not to respond based on the Uber’s automated programming. Alternatively, the self-driving hardware and software may simply have failed to spot the pedestrian from afar.
Companies that test driverless automobile technology on public roads and automakers that place autonomous cars in consumer markets hold legal duties to ensure their vehicles’ navigation programming is safe and works properly.
National Highway Traffic Safety Administration (NHTSA) policy further demands that self-driving autopilots detect objects moving to the left, right, and ahead of the vehicle, and that they yield those objects the right of way under all circumstances.
Hence, even if Herzberg was negligently crossing the street, the self-driving Uber should have found her and moved to avoid risk of harm. Instead, the automobile struck the victim head on with no evasive action or warning beforehand.
The accident therefore “speaks for itself” and liability should shift to Uber, Volvo and their software provider under products liability theory or via negligence per se.
Pedestrian Liability
Some may suggest that Herzberg was wholly or partially liable for her own injury because she walked across a poorly lit public roadway, in the middle of the night, outside of the crosswalk.
The question here is whether Herzberg acted reasonably enough that the law should not find her at fault for her own death.
Sanders Phillips Grossman answers this question by considering Uber’s duty of care. As suggested above the law requires self-driving autopilots to be vigilant and alert while engaged, especially when operating in areas where pedestrians might be present.
Uber’s duty of care to Herzberg was to make sure its Volvo exercised the attention that a human driver would have used in the same situation.
Duties of care grow even greater when technology providers claim an autopilot’s vision and reaction maneuvering are more capable than human behavior.
Careful people slow down on dark roads, hold their high beams on and honk horns when spotting pedestrians nearby.
State statutes also impose special duties of care on drivers regarding pedestrian activity, where prudent driving supersedes other rules such as posted speed limits. When drivers find road conditions unfavorable, they hold a duty of care to avoid accidents by, say, lowering their speed to 15 mph even when the posted limit is 40 mph.
Uber’s driverless technology never slowed down on the dark Arizona road, nor did it turn on its high beams or honk when it detected an object in front of the vehicle. The autonomous automobile therefore did not act as a careful human driver would, and the courts should not hold Herzberg responsible for her own death.
Legal Ramifications on the Horizon
In the old days, accident victims would sue drivers and their insurance companies to redress damages from loss to property, life or limb.
The self-driving car industry changed legal trends as soon as their automobiles began causing harm on public streets.
While lawyers are free to claim company negligence in the wrongful death of Herzberg, Sanders Phillips Grossman strongly advocates the law should hold Uber, Volvo and their technology providers strictly liable for the accident.
In a wrongful death, products liability action brought by Herzberg’s estate, the plaintiffs would need only to prove Uber’s self-driving vehicle was defective and caused an actual and foreseeable death to win their lawsuit.
Negligence theory would require Herzberg’s estate to establish duty, breach of duty, and causation and damages, which is more complex and costly to prove and opens the controversy to additional affirmative defense arguments.
Products Liability
Autonomous automakers and self-driving technology entrepreneurs must exercise reasonable care in designing their autopilots for safe use. The law will further hold these companies strictly liable for damages when driverless systems fail, regardless of fault.
Sanders Phillips Grossman asserts the courts should hold Uber and Volvo strictly liable for the wrongful death of Herzberg if evidence shows:
- The Uber’s autopilot and navigation system did not meet design specifications and standards.
- Uber’s use of an alternative self-driving autopilot design would have reduced Herzberg’s harm.
- Volvo or Uber failed in their duty to instruct Vasquez on how to operate the unmanned driving system safely, or the companies did not give the driver or the public adequate warning of autonomous driving risks.
Driver Negligence
The law holds attentive human drivers negligent when they breach reasonable duties of care in operating self-driving vehicles and cause accidents with foreseeable damages. As mentioned above, Vasquez’s inattentive driving at the time of the collision was most likely foreseeable. Therefore, if the Uber safety driver possessed no duty of care, the law cannot hold her negligent under tort theory.
Governor Allows Autonomous Automobile Testing in Arizona
Arizona Governor Doug Ducey signed a permissive executive order that allowed self-driving car testing on public roads in 2016. Two years later, a second order from Ducey permitted driverless car operation without humans behind the wheel on state roads, as long as “the vehicles follow all existing traffic laws and rules for cars and drivers.”
Arizona is now in a tech race with California to generate jobs in the autonomous automobile market. Ducey’s orders allowed Uber and GM to test over six hundred self-driving vehicles on Arizona public roads in 2018.
National Transportation Safety Board (NTSB) Reaction
The NTSB and other state agencies are investigating Herzberg’s death and Uber to find out why the driverless Volvo did not avoid a collision with the pedestrian.
The agency has been closely following incidents involving autonomous vehicles since 2016, when a self-driving Tesla in Florida killed its driver after the autopilot system failed.
Uber Activity in Arizona and Company Statements
California revoked Uber’s self-driving testing in San Francisco in 2017 after transportation officials found autonomous vehicles running red lights and caught other driverless vehicles in minor fender-bender accidents throughout the city.
This prompted Uber to bring its fleet to Arizona shortly after Governor Ducey signed his executive order. After Sunday night, Uber halted self-driving testing in all US and Canadian cities, including Tempe, Pittsburgh, and Toronto.
Uber executives further expressed regret in public statements averring, “our thoughts continue to be with Elaine’s loved ones.”
Video — https://youtu.be/RASBcc4yOOo