“Dude, where’s my car?”
“It just drove itself away and crashed under a trailer while you were inside the store.”
In a nutshell, that’s the story a Utah man told KSL after coming outside to find his Tesla Model S had gotten into an accident. After a quick review of the incident, Tesla put the blame on the owner’s shoulders, citing the vehicle log as evidence of negligence prior to the crash. Yet the negligence of human beings is exactly why semi-autonomous driving will be dangerous during this transition period. The robots aren’t ready to take over, and we’re not ready to go halfway.
This episode should make even the most stone-faced person chuckle, but a car operating against the wishes of its driver is disturbing news for anyone who occasionally walks down the street. According to the report, the driver had left his Model S in a parking spot, unaware it was ready to activate on its own. At that point, a signal went to the controller in the driver’s hand, alerting him the car was about to “Summon” itself and look for a parking spot.
Since the driver had no clue this was happening, he missed his chance to stop the car as it shifted into auto-park mode. Tesla’s response highlighted the importance of drivers staying alert while their vehicles operate in semi-autonomous mode, while acknowledging that Summon was still in beta testing.
“It is paramount that our customers also exercise safe behavior when using our vehicles — including remaining alert and present when using the car’s autonomous features,” a spokesperson wrote in a statement to KSL. But there are too many gray areas involved.
Self-parking is theoretically one of the safest autonomous drive features. You let it do its work on private property (i.e., your driveway or garage) while you keep the button in hand to stop the car should anything go wrong. Judging by the alerts the Utah driver got prior to his crash, there were opportunities to avoid such a bizarre fate. But the feature still could have done serious harm if the sensors were unable to detect a person in the vicinity. (This can happen when something is below the fascia.)
Tesla Autopilot features are designed to operate on the highway after the vehicle has established a grasp of its surroundings. Driver participation is mandatory, as there is no guarantee the system is foolproof. When a lapse in human attention can have disastrous consequences, the benefits of autonomous technology are outweighed.
Some automakers, including Ford, are choosing to sit out the semi-autonomous driving period (Level 3) and instead work on technology that removes drivers from the equation entirely (Level 4). Google shares this approach. If you’re going to need humans part of the time, that’s enough to put everyone on edge all the time. That’s the scary part about this transitional period prior to full autonomy.
Connect with Eric on Twitter @EricSchaalNY