With rapid advances in technology, the future points toward autonomy: the domestic and industrial application of Artificial Intelligence (AI). A major emerging industry with AI at its core is autonomous vehicles, or self-driving cars. The technology holds immense potential for the transportation industry, promising less risk to human life and reduced logistical costs.
However, self-driving cars still face many technological, ethical, and legal obstacles. While it’s easy to assign fault to a human driver, matters become more complex when the car drives itself.
In this article, we’ll discuss the regulatory landscape of the technology and how it’s progressing towards large-scale deployment.
The Need to Regulate Self-Driving Cars
As the world moves toward Level 3 driving automation, there’s a pressing need for more laws governing autonomous vehicles. The core issues are: who do you hold liable in a car accident with no driver? And even with a robust machine-learning algorithm, what shapes the decision-making of an autonomous vehicle?
These scenarios require a robust regulatory framework before AVs can be fully deployed in real-world applications.
Here are some of the risks associated with self-driving cars:
Driver Liability
Driver liability arises in cases where a human was in control of the vehicle (whether in person or remotely) when the accident took place. In such a situation, the human driver is directly liable for all damages or loss of life resulting from the accident.
Any injuries or deaths caused by the driver’s fault in an accident make them liable to cover all medical expenses of the victims involved in the crash.
Damage to Property
The driver or controller of the vehicle will also be held liable for any damage caused to property as a direct result of an accident that was their fault. If the accident resulted from another driver’s negligence, you won’t be held responsible for any property damage.
Other Damages
Other damages related to vehicle accidents, such as the victim’s loss of income and future living costs, must be borne by the driver if the injury was a direct result of an accident that occurred while they were in control of the AV.
Self-Driving Car Regulations in the US
The US has emerged as the top market for AV and driverless car technology, with major players such as Google, Tesla, and Uber testing their autonomous fleets there. What’s concerning, then, is that there are currently no federal laws or regulations governing driverless cars in the country.
Thankfully, various states have enacted their own regulatory frameworks governing driverless cars and the liabilities that arise from them. California, in particular, has been at the heart of the industry, with AV makers carrying out fleet testing on its open roads.
In 2018, California rolled out an amended version of its Autonomous Vehicle Tester Program, which allowed the testing of autonomous vehicles without a driver inside the car. Since then, AV companies such as Waymo and Tesla have run thousands of self-driving tests in the state without human intervention, though they still have to abide by strict reporting and safety procedures.
Liability for Driverless Car Accidents
The primary question regarding the liability of AV accidents is whether to hold the driver responsible or the manufacturer and software developers. As Level 5 autonomous vehicles can perform complex driving functions without any human intervention whatsoever, there’s an ideological stand-off between the owner of the vehicle and its manufacturers regarding liability.
The law on vehicle accident liability is clear in California: the driver must pay, and is liable to the extent of the damage caused by their negligence. But the lines blur when there’s no driver involved.
The same liability rules that apply to human drivers have been extended to AVs in some capacity. California law defines a driver as the operator of the vehicle, whether they’re in the driver’s seat or in charge of the autonomous systems. Like many other states, California holds the driver responsible if the car was driven with their consent, even if the vehicle was operating autonomously without human intervention.
The California Vehicle Code thus mandates a fail-safe design for AVs that transfers control of the vehicle to a manual operator in case of software failure. A built-in safety alert system warns the operator of any AV failure, after which the operator must manually operate the vehicle’s steering, brakes, and accelerator. If they fail to take control of the vehicle, they’ll be held liable for any accidental damages.
There’s precedent on this matter as well. In 2018, one of Uber’s Level 3 autonomous vehicles was involved in an accident that killed a pedestrian. Investigators found that the safety operator failed to take appropriate measures before the crash because she was distracted by her mobile phone. Prosecutors charged the operator with negligent homicide, and no liability was placed on Uber.
Waymo, Google’s self-driving spin-off, has also reported that numerous accidents have been averted through human intervention since it began its AV testing in the state in 2018.
There’s also legislation that assigns responsibility to manufacturers and autonomous-software developers. Some states treat the autonomous system itself as the driver, which makes manufacturers liable for any mishaps caused by their vehicles.
California’s product liability laws dictate that a manufacturer of a defective product is liable for the damage that product causes. This means the victim only needs to establish a defect in the AV’s operating system, not negligence on the carmaker’s part, to hold it accountable.
Regulations on AV: The Road Ahead
Today, a typical accident involves two drivers, and liability is sure to fall on one of them. Things get more complicated in a driverless world.
As of 2021, 29 US states have enacted laws pertaining to self-driving cars. However, the lack of uniformity and structure has left gaps in determining liability.
Many legal experts argue that the burden will ultimately fall on the manufacturer’s shoulders rather than the driver’s. Human error can be blamed in cars with a driver, but the same reasoning applies to autonomous software, which is also built by humans. A design fault that directly leads to an accident will make the manufacturer liable for the resulting damages.
However, this does not necessarily mean the driver or owner of the vehicle is completely absolved. As more legislation starts governing AVs, insurers will likely rely on the car system’s own data about the cause of the accident. This could place direct liability on manufacturers, who will bear the burden of proving that the fault wasn’t in their design and that the accident stemmed from the driver’s negligence.
In instances where the driver took the car out of autopilot, or kept operating the vehicle despite maintenance and service warnings from the car’s systems, they will be directly liable to the extent of the damage caused by their negligence.
Hire Experienced Accidental Injury Attorneys in California
Whether a car is operated by a human driver or by AI, fault must be determined in order to compensate the victims of an accident fairly.
If you’ve been involved in a car accident, or know someone who has, contact Niral Patel Injury Lawyers to pursue fair compensation for any damages, medical expenses, loss of remuneration, and living expenses incurred. Our experienced team of attorneys is well-versed in federal and California personal injury law and can help you win what you rightfully deserve.