
Tesla seems to be having trouble with its Full Self-Driving features. Not long ago, Tesla fired one of its employees, John Bernal, for posting a video of the Full Self-Driving (FSD) Beta system striking a cone in San Francisco. Since his firing, Bernal has doubled down on documenting the Tesla self-driving software and its bugs.

Tesla’s object detection and collision prevention systems have some serious issues

[Image: 2022 Tesla Model 3 | Tesla]

In a recent video posted to his YouTube channel, AI Addict, Bernal puts his own car on the line.

“Recently, I was scrolling through my phone, and I saw that the IIHS awarded Tesla a Top Safety Pick+ for its AI vision and for its ability to detect objects and slow and avoid them. With that, I thought, hey, I have a Tesla; it relies on AI vision, so let’s throw objects in front of my car and see how it performs,” said Bernal.

To put the Tesla self-driving system to the test, Bernal lines up a series of items to place in front of it: an orange bucket, a shipping pallet, an office chair, a garbage can, a beer keg, a grill, and an entire truck.

Up front, it is worth noting that this is Tesla FSD and not FSD Beta, which gives beta testers access to more advanced features that are still in development. Bernal lost his access to FSD Beta when Tesla fired him.

Immediately, the testing does not get off to a good start. You can see clearly on the center screen of Bernal’s Tesla that it detects there’s an object on the road. The orange bucket appears on the screen as a traffic cone. However, despite the detection, the Tesla makes zero effort to stop and smashes into the bucket.

That doesn’t bode well for Tesla FSD. Bernal confirms that both Autosteer and automatic emergency braking are enabled in his car, so the fact that the car made zero effort to stop or avoid the bucket seems to point toward a flaw in the system.

Unfortunately, the result repeats itself throughout the rest of the testing

Bernal gradually works his way through the objects, consistently placing them in the same spot in the middle of the road throughout the testing. In the case of the pallet, he tests with it both falling over as he approaches and sitting stationary.

Despite the objects increasing in size, and despite the Tesla detecting them and even displaying them on screen, the car makes no effort to avoid them at any point in the testing. Bernal has to swerve hard to keep from striking the bigger objects and causing severe damage to his car.

One of the more surprising failures is the Tesla entirely failing to detect Bernal’s friend pulling out in front of him in a Ford Ranger; Bernal had to brake fully and stop the car himself. Worse yet, multiple times throughout the testing, the Tesla displayed the objects on the road as pedestrians and still made zero effort to stop.

There are plenty of comments suggesting these results stem from Tesla’s switch from radar to its camera-based AI vision system. Tesla CEO Elon Musk has stated that the vision-only system performed better in testing than the radar-assisted setup it replaced. However, this video seems to show the opposite being the case.

Is this testing legit?

There is one point of view to consider, though. It wouldn’t be all that difficult to show the automatic braking and steering settings switched on, as Bernal does in the video, and then turn them off for the actual testing. To be clear, no accusation is being made here. However, it’s easy to see why one might think Bernal has enough of a chip on his shoulder after his firing to go out of his way to paint the self-driving system in a bad light.

That being said, Bernal showed the real behavior of FSD and FSD Beta even during his employment with Tesla, so there’s no real reason to assume he’d start misrepresenting it now.

After all, most Americans still don’t believe self-driving cars are safe. Even though Tesla sells this self-driving system at a massive scale, that hunch may be grounded in reality.

As always, the technology is still in development, and there’s a very good reason that Tesla wants drivers to remain actively paying attention behind the wheel. If nothing else, Bernal’s video should serve as a reminder to Tesla owners: the name “Full Self-Driving” is quite misleading, because full self-driving is far from what it delivers.

Stay safe and alert. Don’t rely on your car’s safety systems; hope they’re there to back you up, but never treat them as a substitute for your attention.
