  • A Tesla Model 3 using Tesla Full Self-Driving software nearly hit a cyclist
  • Tesla’s Full Self-Driving has become the subject of politically charged debate
  • Government agencies have been hesitant to force sweeping change on Tesla

“Are we gonna have to cut that?!” Galileo Russell asks Omar, who runs the Twitter account @wholemarsblog. Omar is in the passenger seat of a Tesla Model 3 being driven by Russell. The Model 3, running Tesla Full Self-Driving, just had a bit of a slip-up: it nearly struck the cyclist in the image below. Thankfully, Russell grabbed the wheel before anything went wrong.

The Model 3 with Full Self-Driving engaged, moments before it nearly hit a cyclist in downtown San Francisco | HyperChange via YouTube

Tesla Model 3 YouTube video shows FSD is still dangerous

The video in question takes place in downtown San Francisco, where, per the duo, Tesla’s controversial software is at its best. Russell, who runs the YouTube channel HyperChange, is recording a podcast with Omar while the two cruise around on Tesla’s FSD software. I watched about 18 minutes of the video leading up to the incident, during which Tesla’s software performed fairly well. Moreover, the two paid attention to the Model 3 the entire time. In short, it’s exactly what you’d want to see from something like this. Almost.

The Model 3’s screw-up comes with comedic timing Jerry Seinfeld would be jealous of. “With the software update, you can actually make thousands of people drive safer, just with a software update overnight.” That’s when the Tesla Model 3 jumps into the bike lane. The EV misses the cyclist, but only because Russell took control. Had a human driver made the same move, anyone would call it dangerous. Instead, the two briefly contemplate cutting the whole thing out of the video before deciding “it definitely wouldn’t have hit him” and “he didn’t even notice.”

The culture surrounding Tesla Full Self-Driving is an issue

This is part of the issue with Tesla Full Self-Driving. Like the pandemic, self-driving has become politically charged. With that comes some, well, strong opinions. And inevitably, personal attacks, some of them coming from Omar’s Twitter account. The culture surrounding something that ought to be objectively evaluated has become a “right vs. wrong” fight, and which side counts as “right” changes depending on who you ask. People like Omar and Elon Musk think incidents like FSD Teslas crashing in the same spot are just part of the beta-testing process.

On the other side, industry professionals and journalists believe the software is dangerous. Much of that concern stems from the fact that Omar, Russell, and others are allowed to test the software on public roads, as well as from the software’s numerous close calls and accidents. In essence, it’s an issue of qualification: some think anyone should be able to test FSD as long as they pass Tesla’s Safety Score, while others think testing ought to be left to trained employees of companies like Tesla or Waymo.

Regulatory measures against FSD still haven’t been taken

Musk leaving a federal court hearing surrounded by police and security | Spencer Platt via Getty Images

The NHTSA has given Musk slaps on the wrist, like the update that removed the “rolling stop” function from Tesla FSD. But the U.S. government is clearly hesitant to make a broader ruling, as doing so would set a precedent for who’s liable when the next crash happens. We’ll simply have to wait and see how the issue develops, hopefully without further harm coming to anyone.
