A crashed Tesla sits on a flatbed

Repeated Tesla Crashes Highlight Major Issues With Autonomy

Tesla has an image problem. Well, its autonomous vehicle software does. Hell, the AV industry as a whole does. Much as with the coronavirus, people have become entrenched on one side of the issue or the other. Most believe that self-driving is either the future or a dangerous fantasy. Honestly, the truth is likely somewhere in the middle. However, a recent string of crashes while the brand’s autonomous software was in use highlights a concerning issue that’ll be very difficult to work around.

There were five Tesla crashes on the same road

The intersection in Yosemite that has claimed the lives of five Tesla models
The forked road that kills Teslas | Google Maps

This looks like a pretty standard intersection, no? Well, yes, but maybe not for a Tesla. This little intersection has claimed five Tesla lives. One of them was a Model X SUV that got turned into a life-sized model by a rock. It seems the brand’s autonomous software has some issues with this intersection. Reddit user u/BBFLG spoke about the experience.

In their post, BBFLG said that their Model X crash went something like this: “Hands on wheel, eyes on road, vehicle just wanted to keep going straight, I took control, entered gravel and smashed into a boulder.” Troubling, to say the least. The user also states in the post that, according to park rangers, there have been three other incidents at the same spot. Clearly, something about Tesla’s software cannot cope here, and it points to a growing concern among industry specialists.

Industry experts are extremely concerned

The Autopilot display in a Model S sedan
The Autopilot display in a Model S | Chris Walker via Getty Images

One such specialist is Missy Cummings, who recently made an appearance on Matt Farah’s “The Smoking Tire” (TST) podcast to talk about our use of autonomous vehicles, like Teslas, and their software. Cummings is a professor at Duke University and the director of the university’s Humans and Autonomy Lab. Qualified, to say the least. On the podcast, Cummings spoke about one of the biggest problems facing autonomous software: it simply isn’t us.

Autonomous software, like that used for self-driving vehicles, must be programmed. It’s binary: ones, zeros, and variables. Human beings are inherently un-binary. We make decisions based on what we think, inferred from past experience and instinct, the latter of which a machine simply cannot replicate. And that’s the problem, as she pointed out on TST. Cummings used the example of a snow-covered stop sign: a machine may not recognize it because it doesn’t look like the reference it was trained on. Then, an accident.
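To make that failure mode concrete, here’s a minimal, purely hypothetical sketch. The function names, features, and threshold are invented for illustration and have nothing to do with Tesla’s actual software; the point is simply that a detector which acts only when an observation matches enough of a stored reference will quietly do nothing when snow hides part of the sign.

```python
# Toy illustration only -- not Tesla's code. A hypothetical perception check
# that acts on a stop sign only when its match score clears a fixed threshold.

CONFIDENCE_THRESHOLD = 0.85  # assumed cutoff for acting on a detection

REFERENCE_STOP_SIGN = {"features": ["octagon", "red", "white_border", "STOP_text"]}

def match_score(observed: dict, reference: dict) -> float:
    """Crude stand-in for a vision model: the fraction of reference
    features that are visible in the observed sign."""
    visible = sum(1 for f in reference["features"] if f in observed["features"])
    return visible / len(reference["features"])

def should_stop(observed: dict) -> bool:
    """Only commit to braking when the match is confident enough."""
    return match_score(observed, REFERENCE_STOP_SIGN) >= CONFIDENCE_THRESHOLD

# A clear stop sign matches every reference feature -> the car stops.
clear_sign = {"features": ["octagon", "red", "white_border", "STOP_text"]}
print(should_stop(clear_sign))    # True

# Snow hides the color and the text -> the score (0.5) falls below the
# threshold, and the "car" sails through the intersection.
snow_covered = {"features": ["octagon", "white_border"]}
print(should_stop(snow_covered))  # False
```

A human driver, by contrast, infers from context (the intersection, the pole, the octagonal shape under the snow) that a stop is required; the toy threshold check above has no such fallback.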

Will cars ever drive themselves?

Reddit user u/BBFLG's crashed Model X in Yosemite National Park
A Reddit user’s crashed Model X | u/BBFLG via Reddit

Frankly, it’s hard to know if Cruise, Tesla, or some as-yet-unnamed party will ever find a way around that simple fact. We can code for as many variables as we want in autonomous software, but in the end, it’ll be extremely difficult to make AV software better than a fully alert, sober human at the wheel.

It’s the classic trolley problem: do you let the train hit the people on the track, or divert it and let it crash? A human can make that decision, right or wrong, and it’s extremely difficult to get a machine to weigh the value of a human life when we’re barely capable of doing so ourselves. For now, driver assistance software will be just that: driver assistance, barring some massive breakthrough in machine learning. We’ll have to keep driving ourselves just a bit longer, it seems.
