For many people, Elon Musk’s Tesla is the future of the auto industry. Every Tesla model reflects that idea: each comes loaded with high-tech equipment that few other automakers put in their cars. One of those features is Tesla’s Autopilot, which unfortunately has had its share of issues lately.
The 2017 Model X crash
As Car Complaints wrote, in late 2017, Wai-Leung Chan was driving his Model X on the Long Island Expressway. He said that the traffic at the time was dense but slow, so he turned on his Tesla’s Autopilot system. He said that his Model X’s Traffic-Aware Cruise Control and Autosteer functions were both active at the time, that its following distance was set at “3,” and that he was alert behind the wheel.
He was trailing a tractor-trailer when another car started to merge between the tractor-trailer and the Model X. The Model X decelerated at first, but then, as the other car merged, the Model X accelerated toward it. Chan claimed that his Model X didn’t recognize the car that had merged between him and the tractor-trailer.
Chan said that he had about a second to react since the Model X’s automatic emergency braking didn’t engage. He steered his Model X away and crashed into two other cars before coming to a full stop. Because the incident happened at low speed, nobody was injured. Car Complaints reported that Tesla said Chan’s Model X functioned properly and that the accident was his fault.
Not the first Tesla Autopilot issue
While nobody was seriously hurt in this accident, the same can’t be said for other Autopilot-related accidents. For example, earlier this year, Bloomberg reported that the family of a Japanese man who died in an accident involving a Tesla has sued the company, blaming Autopilot for causing the car to hit him. If the family is correct, Bloomberg said, this may be the first deadly accident involving Autopilot and a pedestrian.
Autopilot has been blamed for killing Tesla drivers too. Of course, many of these cases are still in the courts, so nobody can say for sure that Autopilot is to blame. Regardless, these accidents shine a light not just on Tesla but on self-driving features as a whole.
Tesla is a long way from a true autopilot
While Tesla calls its self-driving feature Autopilot, it’s not truly an autopilot. In fact, calling such features “self-driving” is misleading, since driving automation is classified into multiple levels, and Autopilot isn’t anywhere near the highest one. According to Car and Driver, there are six levels of self-driving, ranging from Level 0 to Level 5.
Level 0 means that there is no self-driving feature at all, and most cars on the road are like this. Level 1 means that the car gives some driver assistance, a level that can be achieved simply by getting adaptive cruise control on your car. Level 2 adds some additional features, and this is where most self-driving systems, including Tesla’s Autopilot, sit.
Level 5 means a fully self-driving car, and currently no cars on the road are at that level. Tesla still has a long way to go before it can get its cars there, and in the meantime, accidents like the one Chan experienced will happen. That’s why, unsurprisingly, as Car Complaints noted, Tesla has put a lot of disclaimers about Autopilot in its owner’s manual, including that Autopilot is still in testing and that drivers still need to pay attention to the road.