WHEN AUTONOMOUS VEHICLES MAKE AUTONOMOUS DECISIONS

The idea of an autonomous, or self-driving, vehicle is appealing to a lot of people. If you owned a car that could drive itself, you could enjoy your daily commute in other ways. You could get ahead on some work before you arrive at the office, or you could lean back and relax until the caffeine has had a chance to properly kick in. The possibilities are endless, but all that glitters is not gold. Enthusiasm for modern technology isn’t a bad thing, but we should all remember that new technology comes with glitches. For those of us living in Florida, it is legal to use a self-driving vehicle, yet no system is flawless. Unfortunately, that truth is playing out in the lives recently lost to autonomous vehicle accidents.

WHAT ARE AUTONOMOUS VEHICLES? 

Autonomous vehicles are vehicles that can operate on their own, with little or no human input. The majority of car crashes every year are caused by human error or distraction, and many believe autonomous vehicles are the answer to this problem. Some research suggests that more autonomous vehicles on the road would mean smoother traffic, and that this kind of transportation could save commuters as much as 50 minutes a day. Major companies developing self-driving cars include Tesla, Waymo, GM Cruise, and Argo AI.

ARE AUTONOMOUS VEHICLES LEGAL?

Yes, autonomous vehicles are legal. In fact, the State of Florida specifies that “a licensed human operator is not required to operate a fully autonomous vehicle.” In other words, it is completely legal for an autonomous vehicle to drive without a human operator inside it. However, state insurance requirements still apply.

ARE AUTONOMOUS VEHICLES SAFE? 

Most of the time, they are safe. However, there have been rising concerns over the Autopilot feature in some Teslas. For starters, a study by McAfee Advanced Threat Research demonstrated that a Tesla could be tricked into misreading traffic signs. The researchers took a speed limit sign that read “35 mph” and placed a small piece of black tape on the middle section of the number 3, extending it slightly to the left. When the Tesla drove past the altered sign, it misread it as “85 mph” and immediately began to speed up. Because this was a planned experiment, the McAfee team was able to slow the car down right after it accelerated on its own. Ideally, a self-driving car in this situation would have a human driver in the vehicle to correct errors like this, but some states, including Florida, don’t legally require one.

Another major concern is that the Autopilot system will not always react quickly enough when the vehicle is operating on a major highway or interstate. There have already been a few fatal crashes for this very reason, including two in Florida in recent years. In February 2019, Omar Awan died in Davie, Florida when his Tesla crossed three lanes of traffic and crashed into trees on a median. The damage to the vehicle’s battery then caused the car to catch fire. Days later, in March 2019, Jeremy Banner lost his life after a semi-truck pulled out in front of his Tesla. Banner had turned on the Autopilot feature about 10 seconds before the crash. Though he hit his brakes, his vehicle still crashed into the side of the semi-truck, and the impact ripped the roof off of Banner’s vehicle.

In California, Walter Huang died in March 2018 when his Tesla Model X crashed into a concrete barrier while on Autopilot. Huang had complained to his wife that his vehicle always seemed to navigate toward that same barrier when he set the car to Autopilot on his morning commute. Huang’s family is now suing both Tesla and the California Department of Transportation.

WHAT TO DO IF YOU’RE IN AN AUTONOMOUS VEHICLE ACCIDENT

As time moves on, this technology will continue to advance and adapt to our needs. However, it’s crucial that we do not overestimate its power or let it replace our own sound judgment when it comes to safety. Some have expressed concerns about cybercriminals hacking into the operating systems of autonomous vehicles, and while authorities have not seen proof of this happening yet, it’s wise to keep such concerns in mind as we move forward. If you or a loved one has been harmed by the use or improper use of an autonomous vehicle, reach out today to discuss your options for potential compensation. Technology is not foolproof, but it’s not guilt-free either. You have a right to be compensated for any injuries you sustain or damage done to your property. Let a skilled attorney at Keller, Melchiorre, and Walsh help you navigate your next steps.