Once upon a time, self-driving cars were a thing of the future, an idea and an ideal to live up to, and something most people thought they would only see on the big screen. Yet, manufacturers like Toyota Motor Corp (7203.T) and chipmaker Nvidia Corp (NVDA.O) have developed technology that takes autonomous vehicles from idea to reality. Uber Technologies Inc. has also developed a self-driving vehicle, one it felt confident enough in to test on public roads. Unfortunately, the test did not end well.
On March 18, Elaine Herzberg, 49, a wife and mother, was struck and killed by an autonomous SUV while crossing a street in Tempe, Arizona. Herzberg's husband and daughter filed suit against Uber Technologies Inc. for their loss, and the company agreed to settle for an undisclosed amount.
In the wake of the accident, Uber, Toyota Motor Corp, and Nvidia Corp all agreed to suspend the testing of self-driving vehicles on public roads. Other manufacturers are awaiting the results of the investigation to see how they will respond.
Self-Driving Vehicles: Are They More of a Liability Than a Solution?
Following the Uber accident, manufacturers, supporters, and naysayers of the autonomous vehicle initiative alike want to know: are self-driving vehicles more of a liability? Many hope that self-driving vehicles will eliminate the possibility of human error and create safer roads for everyone. Some might say the mission is a success, while others, like the husband and daughter of Elaine Herzberg, might say it has been a complete failure. In fact, USA Today reported three years ago that self-driving cars were involved in crashes at five times the rate of conventional vehicles. Of course, the technology has come a long way since that report, but as the most recent incident suggests, the risks are still there.
Toyota North America Chief Executive Jim Lentz has pushed back against critics, saying, "There will be mistakes from vehicles, from systems, and a hundred or 500 or a thousand people could lose their lives in accidents like we've seen in Arizona." However, he says, the big question is this: "How much risk are they willing to take? If you can save net 34,000 lives, are you willing to potentially have 10 or 100 or 500 or 1,000 people die?"
Should Human Accountability Factor In?
Since autonomous vehicles were introduced to public roads and began causing accidents, consumers have wanted to know: who is liable for those accidents? Yes, the technology is supposed to make driving safer, but should self-driving cars be allowed to function completely free of human intervention? Though some manufacturers have tested vehicles with no humans aboard at all, the technology utilized by Lyft, an Uber competitor, provides room for "pilots" who are "constantly monitoring the vehicle systems and the surrounding environment, and are ready to manually take control of the vehicle if an unexpected situation arises." Footage from the March 18 incident shows that Uber Technologies Inc. has adopted a similar approach, yet the safeguard was not used. According to footage from the incident and ensuing reports, Herzberg was pushing a bicycle while walking across a four-lane road outside of a crosswalk when she was struck. The dash cam recorded the entire incident, both inside the vehicle and out. As the vehicle struck Herzberg, the "pilot" was mostly looking down. Though capable of intervening and potentially preventing the accident, the pilot did not. As a result, a woman died.
Many want to know: should he be just as accountable as Uber itself, or should the technology company take full blame?
Because the answer remains unclear, many auto manufacturers are aiming to keep human drivers on the responsible side of the line. Major manufacturers such as General Motors, Cadillac, and Volvo have developed pilot-assist programs, which allow for partial autonomy but stipulate that human drivers must remain alert and ready to take over control if necessary. The definition of necessity remains purposefully vague, with dips in visibility, changes in weather, or heavy traffic conditions given as examples of when human intervention is required. Volvo has equipped its newest release with touch sensors designed to ensure the human pilot remains engaged throughout the duration of the drive.
Manufacturers hope that by the time fully autonomous vehicles become a reality, the technology will be refined enough to take human drivers, and their liability, out of the equation entirely. Until that day, however, many are playing it safe, and those willing to test autonomous vehicles on open roads likely have protections in place that make the pilot at least partially responsible for any trips gone awry.
What to Do if You Were Hit By an Autonomous Vehicle
If you or a loved one was injured by a self-driving vehicle, you may be unsure how to proceed. Should you sue the car company itself, file a claim with the pilot's insurance company, or both? Will the pilot's insurance company even cover the damages, considering the pilot personally may not have been at fault? There are a lot of questions that need answers, and a skilled Orange County personal injury attorney can help you find them.
If you were hit by a self-driving car, reach out to RMD Law regarding your rights. Our team can help you investigate your case and pursue the compensation you need to recover in comfort. Call our office today to schedule your free consultation.