Tesla has admitted that the Autopilot system does not work correctly in heavy rain or when something obstructs the ultrasonic sensors. Elon Musk has even stated that "FSD Beta 9.2 is actually not great imo." There have also been several incidents where drivers have misused the Autopilot software.
One such incident occurred in 2019, when a Tesla crashed, burst into flames, and killed both occupants of the vehicle, neither of whom was sitting in the driver's seat.
There have been so many problems, particularly accidents, that the National Highway Traffic Safety Administration (NHTSA) has opened an investigation into Tesla's Autopilot software. And if the NHTSA finds enough evidence of safety issues, it could even trigger a recall.
What exactly is Tesla’s Autopilot?
Contrary to its name, Autopilot is just a driver-assist system. Its features include lane-keeping assist, automatic emergency braking, adaptive cruise control, forward collision warning, and automatic lane change. It also has Autosteer, Autopark, and Summon, which allows you to move your Tesla in and out of tight spaces without being in the car. Unfortunately, there have been several incidents of Autopilot malfunctioning and causing accidents.
Many of these incidents involve first responder vehicles, and many have happened after dark. According to Electrek, Tesla's Autopilot system has a known weakness: detecting and stopping for stationary objects on the side of the road. The involvement of first responder vehicles is likely coincidental, since they are frequently stopped on the roadside; there is no bug in the system that specifically targets first responders.
The NHTSA investigation
The NHTSA's investigation began in August after 11 crashes between Teslas and first responder vehicles. In each case, the NHTSA determined that Autopilot had been in use just before the collision. According to Business Insider, a 12th crash was added to the investigation after a Model 3 struck a Florida Highway Patrol car that had stopped on the side of I-4 on August 28th to help a disabled motorist. These 12 crashes led to 17 injuries and one fatality.
According to CNBC, Tesla must turn over data on its entire Autopilot-equipped fleet to the NHTSA by October 22nd, not just the 12 vehicles involved in the first responder accidents. The request covers the cars, software, and hardware Tesla sold from 2014 to 2021. If the NHTSA finds enough evidence of safety issues, the agency can mandate a complete recall of Tesla cars equipped with Autopilot.
Tesla’s new ‘Full Self-Driving’ feature
Again, contrary to its name, Full Self-Driving (FSD) is an advanced driver-assist system. It is essentially an upgraded version of Autopilot that requires a monthly subscription. According to Road Show, FSD is still in beta testing, and the 10th version of the beta is being rolled out. Currently, about 1,000 employees and owners are participating in the beta test.
Is Tesla's Autopilot software unsafe?
Many people argue that the software is safe and that the problems lie with the people operating it incorrectly. Drivers often misinterpret what "Autopilot" means, and perhaps that is the real problem. Maybe the NHTSA's recommendation at the end of its investigation will be to change the name to something less misleading. Of course, it is probably safe to say that Musk would not agree with that recommendation.