
It’s one thing to run a red light. It’s another when your car runs one on its own. That’s what federal regulators are trying to understand in a sweeping new probe into Tesla’s “Full Self-Driving” software.

The tech was recently renamed to add the word “Supervised” in parentheses. That happened after regulators called out Tesla for marketing capabilities the software didn’t actually deliver in real-world driving.

The National Highway Traffic Safety Administration opened a preliminary evaluation on October 7 covering nearly 2.9 million Tesla vehicles equipped with FSD.

According to the NHTSA, the investigation targets how the system handles basic traffic laws, specifically whether Teslas sometimes roll through red lights or swerve into the wrong lane while the software is engaged.

The probe covers the 2016–2025 Model S, 2016–2025 Model X, 2017–2026 Model 3, 2020–2026 Model Y, and the 2023–2026 Cybertruck.

All use variations of Tesla’s “FSD (Supervised)” and “FSD (Beta)” systems, which the automaker describes as Level 2 automation, meaning the car can steer, brake, and accelerate on its own, but the driver must remain fully alert.

NHTSA’s Office of Defects Investigation (ODI) says it has received at least 18 complaints and one media report.

The reports described Tesla vehicles that entered intersections against red signals, failed to stop completely, or misread the color of the traffic light altogether.

Six crash reports have already been filed under the agency’s Standing General Order, which requires automakers to report serious incidents involving automated driving systems. Four of those crashes led to injuries.

Investigators are zeroing in on repeated incidents at the same intersection in Joppa, Maryland.

The repetition at a single location suggests the problem is more than a fluke. According to ODI, Tesla has since made software changes to address behavior at that specific intersection.

The agency is also investigating a second pattern: Tesla EVs reportedly crossing double-yellow lines or drifting into opposing traffic when making turns or traveling straight.

In total, ODI has logged 24 complaints, 6 additional crash reports, and 3 media accounts.

These describe vehicles entering oncoming lanes or taking incorrect turns despite visible signs and lane markings.

Some drivers claimed the software gave no warning before executing these maneuvers, leaving them little time to intervene.

Regulators plan to assess several factors:

  • How much notice drivers get before FSD makes a questionable move
  • How effectively it detects and responds to traffic lights and lane markings
  • Whether system updates improve compliance with traffic laws

This latest probe joins two other active NHTSA investigations into Tesla’s driver-assistance technology. One is tied to a fatal crash in 2024.

As an autos journalist, I’ve come to have an opinion or two about Tesla FSD

Look, I’m all for new tech. I’m not against EVs in theory (although I have qualms about how and where some components in the supply chain are manufactured). The same can be said for any car, by the way, not just Tesla models.

But to market a partially realized idea (Full Self-Driving), to use consumers driving their friends and families, including children, to test it for your company, and to chalk up damage, injuries, and deaths to “Progress in Action” is absolutely abhorrent.

While I do believe in the “Good of Humanity,” I’m not sure U.S. drivers will ever be self-disciplined enough to responsibly control a car set to FSD en masse. Especially if “this” is the end product. Heck, too many folks already drive drunk and on drugs; MotorBiscuit covers loads of those stories. But why add to the fatality numbers when we can at least control this category?

Why launch software that makes it so easy to captain a NOT self-driving car while you’re thinking about anything but navigating traffic?

Maybe we’ll get there

In the meantime, I’m just so tired of reading about people and kids getting hurt or killed in or by a Tesla.

And yes, people of all ages get hurt and killed in any manner of vehicle, over all sorts of stupid mistakes or unfortunate medical events.

But if you can’t open your car doors because, in the moment, you don’t understand how to break out (or in) during an emergency, or the vehicle accelerates through red lights when it shouldn’t, all because you were psyched about the idea of Jetsons-esque tech, what are we doing here?

Maybe I’m flat-out wrong. There’s never been a perfect car, and I doubt there ever will be. But at the moment, considering the current tech, I think that if you don’t want to pay attention to the road, you should hand control to someone lucid, not to a glitchy car that doesn’t quite understand how to stay put at a red light.
