Tesla Autopilot is under fire. Again. This time, rather than a few journalists relaying stories from behind a keyboard (hi), things are a lot more serious. The National Highway Traffic Safety Administration (NHTSA) opened a formal investigation into Tesla’s proprietary adaptive cruise control software. Unfortunately, this comes after a series of accidents, some of them fatal. We’ve heard about these stories before, and admittedly, I’ve covered several of them. But now, Tesla is in real hot water with the Feds.
The brand’s entire model range is subject to inquiry
So, what exactly brought the Feds down on Tesla Autopilot specifically? Unfortunately, there’s been a series of accidents, 11 to be exact, that the NHTSA says involved a Tesla striking first responders while Tesla Autopilot was active. Evidently, the NHTSA doesn’t take kindly to emergency personnel being hit by cars that “drive themselves” until it’s legally convenient that they don’t. Now, I say that because Tesla has a history of shifting responsibility for its software onto owners when it suits the company.
This behavior has caught the attention of the NHTSA, per the Associated Press. Unfortunately, the probe covers the entirety of Tesla’s lineup, some 765,000 vehicles. Not so S3XY. However, it’s important to note that Tesla, and by extension, Elon, are innocent until proven guilty. The investigation is ongoing, but the National Transportation Safety Board (NTSB) has recommended that the NHTSA require Tesla to limit the areas in which Tesla Autopilot can be used.
Tesla Autopilot clearly isn’t a substitute for real people
Per the AP, the NTSB has taken shots at Tesla before, blaming the company for numerous other accidents. However, I have to note that it isn’t just Tesla Autopilot that’s to blame here. People aren’t very responsible with the semi-autonomous software. They’ve done everything from sleeping behind the wheel to “driving” drunk in their Teslas. Most of the time, those people are punished. The NHTSA is now trying to determine whether Tesla shares the blame there too.
In some ways, it should. There’s a culture around Tesla models that’s become rather damaging. From r/tesla on Reddit to Musk’s Twitter feed, the cars are championed as something out of I, Robot or Blade Runner when they simply aren’t. Tesla Full Self-Driving is not full self-driving. It’s a beta at best. Frankly, humans aren’t responsible enough for self-driving cars. And that’s what seems to have spurred on the NHTSA’s investigation.
How can the brand recover from this?
So, it’s pretty clear Tesla has something of an image issue right now. I struggle to think of another automaker whose image has been so marred by public controversy post-Dieselgate. Frankly, Tesla has to start with some transparency. Changing some names would be a good start. “Tesla Autopilot” is somewhat misleading. GM’s “Super Cruise” driver assistance software doesn’t connote self-driving, whereas “Autopilot” does. It would be a start, to be sure, but we’ll have to see what the NHTSA says before any real changes happen.