
Tesla Is Feeling the Pressure From the NHTSA About Its Autopilot Accidents


Vehicles like the Tesla Model S now offer Full Self-Driving mode, an update to Tesla’s existing Autopilot program. Autopilot is an adaptive cruise control system that relies on cameras instead of radar, which Tesla says makes it more accurate. An AI uses the camera views to make decisions based on its surroundings, much like human drivers use their eyes.

However, it might be too soon to add another feature to a program that’s riddled with problems. According to Ars Technica, the NHTSA is finally cracking down on Tesla and its Autopilot malfunctions. How much damage has been caused by Tesla’s Autopilot, and how does the NHTSA seek to prevent further harm?

The problems with Autopilot

Tesla | Getty Images

Tesla’s Autopilot has been under scrutiny since 2016, when a driver was killed while the program was engaged. Although the system reminds drivers to keep their hands on the wheel, its enforcement was too lax. The subsequent investigation found that the driver heeded its warnings for only 25 seconds out of 37 minutes of driving.

Since then, Autopilot has been modified to disable itself after the driver ignores its warnings. Still, that didn’t prevent another death in 2018, when a driver crashed into a barrier after ignoring Autopilot alerts. After reviewing multiple incidents, the NTSB reported that Autopilot encourages driver distraction and needs a massive overhaul.

The NHTSA is investigating Tesla

The most recent investigation comes after numerous Autopilot accidents involving emergency vehicles. There have been twelve incidents in which a Tesla crashed into one of these vehicles with Autopilot engaged. One person was killed as a result, and 17 people sustained minor to severe injuries.

The NHTSA understandably questions how a supposedly advanced camera system can miss these vehicles. The vehicles in question are often large enough to transport multiple people and have flashing lights and sirens. The system also apparently can’t detect workers wearing reflective vests on the side of the road.

The data collected from these incidents calls into question how well Autopilot can operate in everyday situations. It suggests the system could cause a driver to hit a pedestrian at night, even one wearing reflective clothing. There were also incidents in which Teslas running on Autopilot hit other cars parked along the work zone or accident site. Autopilot’s cameras reportedly have night vision, but their behavior during these incidents says otherwise.

The investigation includes all four of the automaker’s current offerings, totaling 765,000 units built between 2014 and 2021. Tesla has until October 22 to cooperate or ask for an extension. Failure to do either may result in a $114 million fine. Additionally, if the NHTSA finds that Tesla is at fault, the company will have to replace some of Autopilot’s components.

What new requirements will the NHTSA impose on Tesla?

According to the NHTSA’s letter, the agency wants data on how Autopilot recognizes objects in low-light situations and how it detects accidents up ahead. Additionally, officials want Tesla to be stricter about where Autopilot can and cannot be used. The manual states that it should only be used on divided highways, but the software makes no effort to prevent drivers from using it wherever they please.

Driver monitoring is still a huge issue that needs to be addressed. Currently, drivers can get away with ignoring Autopilot’s warnings simply by tapping the steering wheel. The NHTSA wants better enforcement to make drivers pay attention and take the wheel when necessary.

Elon Musk has previously stated that AI response times are quicker than those of humans. By that logic, he believes it might be less safe to have a human in control during an emergency. While that may be true, Autopilot isn’t refined enough to prevent major accidents.
