Is Tesla Or Drivers At Fault For Mishandling Autopilot & FSD Features?

There have been several accidents and investigations involving Tesla’s Autopilot and Full Self-Driving (FSD) Beta features, but who is at fault: the company or the driver? Autopilot is a driver-assistance system that can steer, accelerate, and brake within its lane, and it includes cruise control that keeps pace with surrounding traffic. Enhanced Autopilot adds more capabilities, such as changing lanes, parking, and summoning the car to the driver, but there are caveats to all of these features. Finally, Full Self-Driving includes everything above and can also identify stop signs and traffic lights. Of course, the driver must remain alert while using any of these features.

In July 2022, the National Highway Traffic Safety Administration (NHTSA) opened investigations into crashes that resulted in deaths. One involved a 2018 Model 3 and another a 2015 Tesla; Tesla’s driver-assist technology is believed to have been in use in the former, but it’s unclear in the latter. The NHTSA is also investigating another incident from the same month, in which a Tesla crashed into a motorcycle and killed the rider. In addition, phantom braking is another issue the agency began investigating in February 2022. These are just some of the ongoing investigations.

California’s Department of Motor Vehicles investigation into Tesla’s FSD system, ongoing since 2021, has just resurfaced. The agency alleges that Tesla misled consumers about how the systems work, pointing to marketing on Tesla’s website that said, “All you will need to do is get in and tell your car where to go…Your Tesla will figure out the optimal route, navigating urban streets, complex intersections and freeways.” It sounds like the future is already here. While the advertisements may be misleading, the support pages on Tesla’s website describe more accurately how these systems actually work. Of course, the onus should be on the driver to read the instructions before operating technology like this. But not everyone reads directions, and that’s where the line gets hazy.

Who Should Be Held Accountable For Mishandling Driver-Assist Technology?

As mentioned above, crashes have occurred while Full Self-Driving technology was in use. The driver is typically held liable for a collision, depending on the circumstances. Some Tesla drivers hand complete control over to their vehicles instead of staying alert, as Tesla’s support website instructs. There is clearly a disconnect between what this technology is capable of and what people think it’s capable of, and it should fall on Tesla to adequately explain the features of its EVs. While the threat of Tesla being banned from selling cars in California exists, the more likely outcome is that the company will be mandated to educate its customers on these driver-assistance features. It should have been doing this already, though. These are cool features to advertise, but Tesla’s first priority should be keeping its customers safe.

If mandating proper education on how to use these features doesn’t work, California’s DMV will have to decide whether to ban Tesla from selling vehicles in the state or focus its attention on individuals who misuse the technology. Of course, not every Tesla driver mishandles the Autopilot and FSD features. There have been many reports of incidents, but relative to the number of Tesla vehicles on the road, the rate is low. It’s still something that should be addressed for drivers’ safety, though.