
It is no secret that Tesla and Full Self-Driving haven’t had the easiest time. A recent New York Times article dropped a bombshell: one of the original Autopilot / FSD videos may have been faked years ago. Whether or not that is true, a lot of unanswered questions about the capabilities of Full Self-Driving are still floating around.

Elon Musk has always been confident in Tesla’s Full Self-Driving technology

Problems Between Former Tesla Employees, Elon Musk, and Full Self-Driving Remain
Problems between Tesla and Full Self-Driving remain an issue | Julian Stratenschulte/picture alliance via Getty Images

The New York Times article reports that the National Highway Traffic Safety Administration (NHTSA) uncovered this information during a recent investigation. The agency has open investigations into Tesla and Full Self-Driving after Autopilot was engaged during fatal accidents. Twelve crashes in which Teslas hit emergency vehicles are currently under investigation. Those crashes left one person dead and 17 others injured.

“As the guiding force behind Autopilot, Mr. Musk pushed it in directions other automakers were unwilling to take this kind of technology, interviews with 19 people who worked on the project over the last decade show.” 

The New York Times

The article notes that Elon Musk “misled buyers” about Full Self-Driving’s capabilities. While Tesla and Musk have always been adamant that responsibility for driving remains with the driver, the way FSD is presented seems to say something different.

In one video from 2016, Musk says, “The basic news is that all Tesla vehicles leaving the factory have all the hardware necessary for Level 5 autonomy.” That claim has been proven wrong multiple times; the Society of Automotive Engineers defines Level 5 as full driving automation, and no matter what is going on behind the scenes, Tesla is not there yet.

There is no denying that Tesla’s Full Self-Driving is advanced

Since Tesla relies so heavily on the internet to interact with fans and drivers and to get information out into the world, there is an endless supply of documented Tesla marvels. Former employees of the brand say that the video above was faked. The Model S shown in the original Autopilot and Full Self-Driving video hit a roadside barrier during the ride, and other software allegedly helped the car follow a predetermined path. Three former employees confirmed these details, yet Tesla still uses the video to demonstrate how FSD works.

Now, 19 former employees have come forward to speak on the condition that they remain anonymous. The group fears that Tesla and Musk could retaliate over the information they provided, as has happened in the past.

The New York Times says that regulators warned Tesla and Musk that Autopilot isn’t as ready for the road as the company makes it seem, which has led people to misuse the technology. Drivers have been in accidents while riding in the car’s back seat or while intoxicated in the front seat.

Is The New York Times article going to have an impact?

“Where I get concerned is the language that’s used to describe the capabilities of the vehicle. It can be very dangerous,” said Jennifer Homendy, chairwoman of the National Transportation Safety Board. Others who have worked on Autopilot say the constant updates can be confusing, leaving drivers unsure of what the vehicle can or cannot do.

While Musk and Tesla have recently been emphasizing cameras over radar, the switch doesn’t seem to be paying off. “FSD Beta 9.2 is actually not great imo, but Autopilot/AI team is rallying to improve as fast as possible,” Musk said on Twitter back in August of this year.

After the incident involving Joshua Brown and his Model S, Musk called a meeting. The Model S was using Autopilot when a tractor-trailer crossed in front of it, and the vehicle was equipped with the radar and cameras that Autopilot relies on. Two former employees who were in that meeting said Musk insisted the cars needed to be able to avoid hitting anything. Tesla later said the Autopilot camera could not distinguish a white truck against a white sky, but it has not explained why the radar did not prevent the accident.

For now, there do not seem to be any concrete answers one way or the other. While the NHTSA investigates Tesla and Full Self-Driving, plenty of drivers remain on the road convinced the technology works. If Full Self-Driving were as advanced as the videos would have people believe, why isn’t fully autonomous driving available right now?
