Tesla has been an innovative automaker in many ways, and not only in EV technology. As many people know, the company has also drawn notoriety for Autopilot, its name for its semi-autonomous driver-assistance system. Cars with features like Autopilot are only semi-autonomous because, ultimately, they are not true self-driving cars, and they can be fooled by real-world conditions, including a particular road in Yosemite National Park.
The issues surrounding Tesla’s Autopilot
There are six levels of autonomous driving, from Level 0 to Level 5. At Level 0, the car is not autonomous at all; these are the cars most people drive. At Level 5, the car is a true self-driving machine. Tesla's Autopilot, like many other automakers' equivalent systems, currently sits at Level 2, which shows how far Tesla and others still have to go before building a truly self-driving car.
As one might expect from a Level 2 system, accidents have happened involving Tesla's Autopilot feature. Some have been fatal, while others have been less serious but still far from ideal. Some accidents have involved a Tesla hitting other cars or property, while others have involved a Tesla potentially hitting a pedestrian.
These accidents happen for various reasons. In some cases, the driver may have been using Autopilot without paying attention to the road, despite Tesla's instructions. In others, Autopilot was fooled by the circumstances in front of it. According to Business Insider, that's exactly what happened on one road in Yosemite National Park.
That one road in Yosemite
Business Insider reports that a Tesla owner was driving in Yosemite National Park with Autopilot engaged. The driver was operating his Tesla correctly, keeping his hands on the steering wheel the whole time. However, when the road forked into two, the car simply kept going straight.
Fortunately, the Tesla was only going 25 mph at the time. Even with his hands on the wheel, though, the driver could not correct course before the car crashed into a boulder, according to Business Insider. On top of that, the owner was unable to pull the Tesla out himself, so it cost him about $1,000 to move the car to a nearby Tesla Body Center for repairs.
That said, this may not have been the first time a Tesla has run into trouble at that spot. Business Insider reports that, according to the driver, park rangers told him three other Teslas had crashed in the same place. Additionally, he claims a fifth Tesla crashed there shortly after his did.
Self-driving tech like Tesla Autopilot is complicated
Because of these issues, Autopilot and similar features are relatively controversial. The systems are not perfect yet, and drivers who don't use them properly may pay the ultimate price. That points to one of Tesla's more common responses regarding Autopilot: drivers still need to pay attention even with Autopilot engaged.
However, another common response from Tesla is that Autopilot has ultimately saved lives. Autopilot is more than a semi-autonomous driving feature; it can also act of its own accord if it believes the driver is incapacitated. As such, drivers who are under the influence may get into fewer accidents with Autopilot.
Additionally, these issues aren't exclusive to Tesla. Other automakers with similar systems have been involved in comparable incidents in the past. Semi-autonomous driving systems like Autopilot aren't perfect now, but in the future, they may be.