Tesla Autopilot has its benefits and drawbacks, but for the most part it helps drivers. While Tesla has been working hard to get Full Self-Driving up and running, it isn’t there yet. However, one intoxicated driver put far too much faith in the system, and it didn’t pay off. Is Tesla responsible for people misusing Autopilot? Not necessarily.
What does Tesla Autopilot do?
Tesla has a variety of features that assist drivers on the road. According to Tesla, “Autopilot advanced safety and convenience features are designed to assist you with the most burdensome parts of driving. Autopilot introduces new features and improves existing functionality to make your Tesla safer and more capable over time.” Additionally, Autopilot helps drivers steer, accelerate, and brake automatically.
The one big stipulation listed is that Autopilot still requires active driver supervision, and it does not make the vehicle autonomous. In other words, you still have to be an active participant in driving the car.
Is Tesla Autopilot self-driving?
Tesla Autopilot is not fully self-driving, regardless of what the name might imply. Over the weekend, a Tesla Model S driver was spotted asleep behind the wheel in Norway. He was clearly not awake or paying attention while Autopilot steered and guided the car. Presumably, the driver input the destination after getting in and then decided to nap. Electrek reported that the driver was intoxicated.
Other drivers on the road witnessed this and tried to wake him, but he remained fast asleep as the car rolled down the road. One of Autopilot’s main safeguards is that even while the car is driving, the system periodically asks for driver input. If a situation arises that it cannot handle, it prompts the driver to take over rather than continuing to operate on its own.
Tesla Autopilot chimed in and asked the driver of this Model S to respond. When it got no reply, the vehicle eventually stopped in the middle of the road. Police determined the 24-year-old man was intoxicated and released a statement confirming the speculation.
“At 0540; a Tesla stops in the tunnel. It turns out to be a man 24 years old who has fallen asleep behind the wheel. He is also drunk, but stubbornly denies driving. Although there is a video of him from the car … Necessary samples have been taken.” — Electrek
The driver still needs to be an active participant
There is always an argument that Tesla Autopilot isn’t safe, but in this case the feature helped the situation end without incident. Without Autopilot, this intoxicated driver would have been navigating the streets with no assistance at all. He could have seriously hurt himself or someone else.
Of course, there is also the argument that Tesla encourages this behavior with the Autopilot branding. But for all intents and purposes, Tesla Autopilot is the reason this story did not have a tragic ending. In the Autopilot and Full Self-Driving Capability section of Tesla’s website, Autopilot is described as follows:
“Autopilot is an advanced driver assistance system that enhances safety and convenience behind the wheel…Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver.” — Tesla | Autopilot and Full Self-Driving Capability
In conclusion, a drunk driver behind the wheel of a Tesla is not the company’s fault. Tesla Autopilot is designed to assist the driver; it does not drive the car on its own. The name “Full Self-Driving” capability doesn’t exactly make the situation clearer, but Tesla and Elon Musk are still working on it.