Autonomous Cars Crash, Too. So Who Is Responsible?
Hardly a day goes by without an article or television news story about driverless, or "autonomous," cars. Depending on whom you believe, there are roughly as many people and organizations convinced that cars that drive themselves are a certainty as there are predicting there's no possible way this will ever happen.
But no matter what side of the discussion you are on, there is also a lot of speculation as to what would happen to the practice of personal injury law if these vehicles in fact become a mainstream method of transportation.
For those of you who are laughing and cheering with anticipation of the demise of a large number of lawyers, don't get too excited just yet. Currently, there are a number of disruptions to the legal business, including websites like LegalZoom and Nolo where consumers can supposedly go to get help with things like wills, divorce or incorporating a business. Frankly, those types of enterprises don't pose a threat to us, a Fort Myers-based personal injury law firm that helps people who have been seriously injured in some type of motor vehicle crash.
At Goldberg Noone Abraham law firm, we realize that the vast majority of our clients come to us because a friend or relative has given them our name. We also realize that it would be foolish to stick our heads in the sand and ignore the fact that autonomous vehicles could become a reality, and in the not-too-distant future. Some may conclude that since the large majority of car and motorcycle crashes are the result of human error, driverless vehicles will put an end to the practice of personal injury law.
But we think the exact opposite may happen.
New Technology Means New Legal Challenges
If driverless cars do become part of our future, there will be a world of new challenges for legal practitioners, the court system, jurors and many others. If an autonomous vehicle gets involved in an accident and there is no one behind the wheel, who is held accountable? The answer at this point is easy – no one knows. In some cases, as with Mercedes-Benz, Volvo and Google, developers and manufacturers of driverless cars say they will accept or assume liability if one of their vehicles is found to have caused a crash. The downside, however, is that if they investigate after the crash and determine there was nothing wrong with the car's systems, they may reject any responsibility.
Computers are not perfect, as anyone who has Windows 10 can attest. Computers and software can crash without notice, and even a small glitch can presumably affect the operating system of anything run by a computer. (We won't even get into the topic of someone hacking into a car's computer system, which has already happened.) In the event of a crash involving an autonomous vehicle, there will be a whole new list of potential defendants brought into the liability picture. The company that developed the software, the individual programmers, GPS mapping companies, the developers of the algorithms the car relies on, or perhaps even Google itself – any or all of them may be in a position to be held accountable for the crash of an autonomous vehicle.
There are many other variables that may come into play in the event of a driverless car crash. And in using the term "driverless," it should be pointed out that in the majority of autonomous cars, a person is still required to be inside the vehicle. What happens if the vehicle's occupant is found to be over the legal blood alcohol content limit after a crash – does that affect who will be held liable?
Government regulators are currently scrambling to put new laws and regulations into place in anticipation of the arrival of driverless cars. The problem is, if each individual state is responsible for writing and enacting its own laws, we'll end up with a mish-mash of rules that will only add to the problems of crash liability and litigation.
Recently, there was a fatal crash in Florida involving a Tesla equipped with a "crash prevention" system. During a Senate investigation of the aftermath, a Tesla spokesperson admitted that the automatic braking system failed, but apparently would not concede that the braking system was part of the company's so-called Autopilot system. The New York Times reported that Tesla executives have pushed back against any governmental action that could slow the introduction of automated-driving technology, which should probably surprise absolutely no one. But the story also points out a view held by Tesla that should scare the hell out of everyone:
Tesla expressed the view that while some deaths might occur as automakers developed and perfected these kinds of innovations, the safety benefits outweighed the risks.
Really? So how many people need to die while Tesla tests its systems?
The bottom line is that an unimaginable list of factors will make their way into any future litigation involving the crash of an autonomous vehicle. This is completely new territory that, at least initially, appears likely to produce more questions than answers.
If more lives are saved by the introduction of driverless technology, as proponents have predicted, we'll be thrilled. Few things are more devastating than dealing with the tragic aftermath of a serious or fatal car crash. It never gets any easier, helping the loved ones of someone whose life was taken by the negligent or careless act of another. But that's what we do. And we frankly don't see that coming to an end if self-driving cars start to fill our roads.
We've posted about autonomous cars previously. To read more, please click here.