Earlier this week we told you about the fatal Tesla crash in which two people died. But neither of them was driving the car. One person was found in the front passenger seat and the other in the back seat. Suspicions are that Tesla's Autopilot was involved. The reality is that no one has yet figured out how this Tesla crash happened.
Tesla is being blamed for giving the false impression that if Autopilot is installed, your Tesla will drive itself. And now Consumer Reports has shown how easy it is to make your Tesla driverless. Nobody is recommending you do this. After all, you don't want to become a terrible headline.
Are there adequate safeguards for Tesla's Autopilot?
Are there adequate safeguards to stop someone from trying to turn their Tesla into a driverless vehicle? Apparently not. Right from the start, Tesla warns drivers not to divert their attention from what is happening beyond the windshield.
But the folks at CR decided to see if they could trick the Autopilot system into driving without a driver. On a closed-course track in Connecticut, they tried to rig a Tesla Model Y to go without anyone in the driver's seat.
CR started by engaging Autopilot while on the track. Then, they set the speed dial to zero, which brings the car to a stop. The next step was to hang a small weight on the steering wheel.
Tricking the Autopilot software into sensing there is a hand on the wheel
This tricks the software into sensing there is a hand on the wheel. After that, the driver moved to the passenger seat without opening any doors, because opening a door disengages Autopilot. The driver's seat belt was kept buckled.
From the passenger seat, the same steering wheel dial was turned to initiate acceleration. The speed can be controlled by the dial, and stopping the car requires manually returning the dial to zero.
“The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat,” driver and CR Senior Director Jake Fisher said. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”
Well, not so fast!
Well, not so fast. CR says that because Tesla's Autopilot does not use eye tracking as GM's Super Cruise and Ford's BlueCruise do, it is easy to fool the system. But the steps to trick the software require more knowledge of how to defeat Autopilot than most Tesla drivers have. (No offense intended to Tesla owners. Please, no angry comments. We love you all!)
The other thing is that each step in tricking Autopilot defeats something that would be considered a safeguard. So, actually, there were a number of safeguards built into Autopilot that had to be bypassed before it could be made to go driverless.
CR places the blame at the feet of Tesla
Unfortunately, CR places the blame for the Tesla crash at the feet of Tesla's Autopilot. We don't know whether that is quite fair, given the elaborate procedure necessary to go driverless. And besides, even in this fake driverless condition, acceleration must still be controlled by the person sitting in the passenger seat.
Still, it is interesting to watch the process and see that the Model Y can, in a fashion, be driven without an actual driver.