What Can We Learn From the Self-Driving Uber Accident?

Update: New information about this crash is available. Read the update to this article.

There have been other fatal crashes and serious injuries involving semi-autonomous cars in the United States, but the first fatal self-driving car crash occurred on March 18, 2018. As we prepare for a world of driverless cars with neither steering wheels nor brake pedals, we should all understand how this technology may affect our safety.

Why Does the Arizona Uber Crash Matter in San Antonio?

Waymo and other autonomous vehicle companies are considering San Antonio as a test site for self-driving vehicles. We already have semi-autonomous cars on the road (such as Teslas and some Cadillacs). As a result, San Antonio can expect to see more accidents involving these vehicles, especially in the early days of their development.

RELATED ARTICLE: Self-Driving Cars Are Already on Texas Roads. What Should You Do in a Crash?

Some lawyers seem to be shrugging their shoulders about the fatal crash in Arizona. They think autonomous vehicle accidents are inevitable and that we shouldn't make too much of the Arizona case.

But no one should die just so we can continue developing safer vehicles. And every crash victim deserves justice for their losses, no matter who is at fault: a reckless driver, a defective sensor, a faulty algorithm, or a decision-maker at a company. At Crosley Law Firm, we are closely watching the details of the Uber case and others because we are prepared to represent self-driving car crash victims in Texas.

Autonomous vehicle cases will also be much more complex than a traditional claim against a driver. In addition to the driver's behavior, personal injury lawyers will have to understand and carefully investigate each manufacturer's hardware and software systems. We will need to look for design and manufacturing defects that might have caused the collision or made it worse. If your personal injury lawyer isn't already preparing for this new reality, you'll be at a serious disadvantage if you're in a crash.

Below, we'll take a closer look at the known details of the fatal Arizona Uber crash to show just how complicated a self-driving crash lawsuit could be.

Uber's Testing Systems and Self-Driving Technology Raised a Lot of Red Flags

While there's still a lot we don't know about the Arizona Uber crash, we do have some basic information. Uber began testing its autonomous vehicles in Arizona in 2016, after California refused to permit its testing operations; there were reports that Uber's vehicles repeatedly ran red lights and could not identify bike lanes while operating in San Francisco. California strictly regulates autonomous vehicle testing and demands driver intervention data; Arizona is much less strict.

Early on, Uber also had two people riding in each self-driving car. One, called a "safety driver," sat in the driver's seat and was responsible for taking control of the car in an emergency. The other helped with data collection. In 2017, Uber cut its autonomous driving crews down to just the safety driver.

Around the same time, Uber switched up its fleet of self-driving cars and opted to use fewer sensors than it had previously. While the older cars had seven sensors, the new SUVs had only one, mounted on the roof, which may have created a three-meter blind spot around the vehicle.

The company was also having problems with its new vehicles. Drivers had to "intervene" and prevent traffic violations roughly every 13 miles. In comparison, one of Uber's competitors, Waymo, reported that its vehicles could drive an average of 5,600 miles before the driver needed to intervene.


An Uber Self-Driving Car Crashed Into Elaine Herzberg

At approximately 10:00 p.m. on March 18, 2018, Elaine Herzberg was walking her bicycle across a street. She was not at a crosswalk. In the video released by police, the Uber safety driver appears to be distracted in the moments before the collision. She only looks up (from what appears to be a device) seconds before the vehicle strikes Ms. Herzberg. The driver reported that Ms. Herzberg appeared out of nowhere and that she did not have an opportunity to react.

It also appears that the driver did not apply the brakes until after the impact. The self-driving car did not attempt to slow down, veer, or brake immediately before the collision. Afterward, Ms. Herzberg was transported to a hospital where she subsequently died from her injuries.

Uber Immediately Entered a Confidential Settlement with Ms. Herzberg's Family

On March 28, 2018, Uber announced that it had reached a settlement with Ms. Herzberg's family. Her husband and daughter have indicated that they consider the matter closed and will not comment on her death or Uber's involvement. While the terms of the settlement are confidential, we can safely assume that Uber offered a significant amount of money to avoid a lawsuit and additional public scrutiny.

While the settlement may offer the Herzberg family closure, it's bad for the rest of us. A lawsuit would have:

  • Given the public much more information about potential problems with Uberā€™s technology
  • Encouraged the self-driving car industry to reexamine its safety protocols
  • Helped voters and lawmakers understand how to build regulations that would protect motorists and pedestrians

Unfortunately, Uber bought the Herzberg family's silence, and we'll have to wait and see what the ongoing federal investigations disclose.

What Claims Did the Herzberg Family Have Against Uber?

Car accident lawyers and technology experts are speculating about why the Uber vehicle failed to stop or avoid the collision. At Crosley Law Firm, we see several issues at work in the crash: driver negligence, Uber's failure to protect, and product liability.

Driver Negligence and Uber's Failure to Protect

It's not hard for people to lose attention. Scientists have studied the "vigilance decrement" since at least World War II. According to this well-documented phenomenon, people quickly become distracted when they engage in boring, repetitive tasks. In one recent study, most drivers became distracted within 20 minutes of driving on a flat, straight section of highway.

Uber drivers were reportedly riding in self-driving vehicles for eight-hour shifts, frequently at night. The cars drove preassigned routes, circling the same path over and over. The safety driver's job was to sit in the vehicle as it looped through the streets and watch for potential danger. Under these circumstances, boredom and reduced vigilance are inevitable. It also didn't help that Uber had cut its driving teams down to one person.

RELATED ARTICLE: Distracted Driving: The Dangers of Hands-Free Technology

The driver's reaction during the Uber collision is one of pure panic and horror. It's clear she wasn't trying to cause a crash. However, she does appear to have lost interest in the route her vehicle was driving, and she didn't even appear to be looking at the road in the moments before impact. If the Uber claim had gone to trial, lawyers might have argued that both the driver and Uber were negligent: one became overly distracted, and the other failed to establish safe procedures that minimized the vigilance decrement.

Product Liability

Today's self-driving cars rely on a complex combination of hardware and software. They use cameras, GPS, and LiDAR (light detection and ranging) sensors to scan the road for lane markings, other vehicles, and potential hazards. A series of computer algorithms interprets this data and makes driving decisions based on the information received. If a single system malfunctions, or is poorly designed in the first place, the results can be catastrophic.

Fingers are being pointed at both Uber and the hardware manufacturers. Many of Uber's competitors claim that their vehicles would have responded appropriately and could have avoided Ms. Herzberg. The public probably won't know the whole story until the National Highway Traffic Safety Administration (NHTSA) and National Transportation Safety Board (NTSB) complete their investigations.

However, as autonomous vehicles become increasingly available, personal injury lawyers will shift their attention away from negligent drivers and toward product liability claims. Today, driver error causes the majority of motor vehicle accidents. As drivers become less involved in their vehicles' operation, they will bear less and less of the blame for crashes and collisions.

Instead, we'll focus our liability claims on the companies that design, manufacture, install, and maintain vehicles' hardware and software systems. It's telling that insurance companies are already offering policies that cover autonomous vehicle liability.

Uber's decision to use fewer sensors may also have been a factor in the crash. The reduced sensor array may have created a three-meter blind spot, and it's possible that Ms. Herzberg was in that blind spot immediately before or during the crash. Lawyers would have carefully examined Uber's sensors, processors, and software algorithms for design and manufacturing defects.

But what about Ms. Herzberg? It's unclear whether her jaywalking at night would have affected the claim. (If a traditional car had hit Ms. Herzberg, the driver would certainly have used this as a defense.) Unlike a human driver, though, we expect the automated systems on self-driving vehicles to constantly scan the streets for hazards. They are designed, and touted, as safer than a human driver. If the software and hardware failed or were poorly designed, a lawyer might argue that Ms. Herzberg's jaywalking was a non-issue and that Uber was strictly liable.

Regardless, if Ms. Herzberg's claim had gone to court, it would have been complicated, requiring multiple expert witnesses, accident reconstruction, courtroom presentations, and more. Her family would have needed a skilled lawyer with significant knowledge of driver negligence, product liability, and the developing world of autonomous vehicles.

Crosley Law Firm: Dedicated Personal Injury and Product Liability Attorneys

At Crosley Law Firm, we have extensive experience with car crash and product liability claims. Our lawyers use advanced technology and collaborate with teams of experts when we prepare and present our clients' claims. If you or a loved one has been injured in a car crash, please complete our online form or call us at 210-LAW-3000 | 210-529-3000 for a free, no-risk consultation.

References

Davies, A. (2018, March 24). The unavoidable folly of making humans train self-driving cars. Wired. Retrieved from https://www.wired.com/story/uber-crash-arizona-human-train-self-driving-cars/

Flahive, P. (2018, February 28). Self-driving cars may soon be tested on San Antonio streets. Texas Public Radio. Retrieved from http://tpr.org/post/self-driving-cars-may-soon-be-tested-san-antonio-streets

Marshall, A. (2018, March 21). Uber video shows the kind of crash self-driving cars are made to avoid. Wired. Retrieved from https://www.wired.com/story/uber-self-driving-crash-video-arizona/

Morris, D. (2017, February 26). Uber's self-driving systems, not human drivers, missed at least six red lights in San Francisco. Fortune. Retrieved from http://fortune.com/2017/02/26/uber-self-driving-car-red-lights/

Scism, L. (2018, March 21). Insurers race to develop coverage for driverless cars. The Wall Street Journal. Retrieved from https://www.wsj.com/articles/uber-accident-wont-stop-insurers-from-racing-to-develop-coverage-for-driverless-cars-1521624600

Wakabayashi, D. (2018, March 23). Uber's self-driving cars were struggling before Arizona crash. The New York Times. Retrieved from https://www.nytimes.com/2018/03/23/technology/uber-self-driving-cars-arizona.html

The content provided is for informational purposes only and should not be construed as legal advice on any subject.