Tesla Autopilot is a well-known feature, but is it well understood? While the name of Tesla’s semi-autonomous driving system makes it sound as though Autopilot can drive the car itself, the truth is far from it. Yet this hasn’t kept people from crashing their Teslas, sometimes into emergency vehicles. Tesla Autopilot accidents are persistent, and now the NHTSA is investigating.
The technology behind Tesla Autopilot is far from perfect
Although it’s called Tesla Autopilot, a more accurate name would be something like Tesla Semi-Autonomous. The Autopilot function is designed to give drivers added comfort and safety, but it doesn’t nullify the responsibilities of operating a vehicle. Tesla Autopilot users are still supposed to keep their hands on the wheel at all times, just as when driving a regular car.
Tesla Autopilot relies on a camera-based system to monitor the road and the area around the vehicle. This system replaces the radar sensors that Tesla used previously; Tesla’s Autopilot and Full Self-Driving features now rely solely on cameras. The system is driven by a neural network trained on data gathered from across the Tesla fleet, so what each car encounters helps improve the performance of all of them.
These recurrent accidents raise the question: how effective is Tesla’s camera-based system? And how effective were the radar sensors that came before it?
Tesla Autopilot accidents often involve emergency vehicles
There have been eleven accidents to date involving Tesla’s Autopilot function and emergency vehicles. In some of them, the Teslas have crashed directly into emergency vehicles that were already in the midst of dealing with another accident or incident.
Slate talked to Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon who specializes in self-driving cars. He explained that these accidents might be happening because the emergency vehicles the Teslas are crashing into are stationary.
When Tesla was using radar sensors, those sensors sent out electromagnetic waves, which bounced off any objects around them and came back. Because of the Doppler effect, the frequency of the returning waves changes based on how the objects around the Tesla are moving. If the objects are stationary, they’re not as easy to detect. The Tesla is also getting returns from other stationary objects like buildings and the road itself, and this can confuse the system.
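The physics behind this point can be sketched in a few lines. This is purely illustrative, not Tesla’s code; the 77 GHz carrier frequency is an assumption based on typical automotive radar.

```python
# Illustrative sketch of the two-way Doppler shift an automotive radar
# measures from a target moving at a given relative speed.
C = 3.0e8        # speed of light, m/s
F_RADAR = 77e9   # assumed carrier frequency, Hz (typical automotive radar)

def doppler_shift_hz(relative_speed_mps: float) -> float:
    """Two-way radar Doppler shift: f_d = 2 * v * f0 / c."""
    return 2.0 * relative_speed_mps * F_RADAR / C

# A car closing at 30 m/s produces a clear frequency shift,
# but a parked ambulance produces none at all.
print(doppler_shift_hz(30.0))  # ~15400 Hz
print(doppler_shift_hz(0.0))   # 0 Hz
```

A stopped vehicle returns a zero shift, the same signature as the road surface and buildings, which is exactly why older radar struggled to separate a parked ambulance from background clutter.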
Rajkumar notes that newer generations of radar do a better job at distinguishing between something like the road and a stopped ambulance.
The camera-based system has its own problems
Tesla’s new camera-based system isn’t necessarily better than the radar system that preceded it. The neural network mentioned above has some serious limitations, especially early in its adoption. The camera-based system works by interpreting the pixels of the objects around the vehicle; the neural network learns what those pixels mean by learning patterns. When the Tesla encounters something it isn’t familiar with, it doesn’t register it as an object to avoid.
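The failure mode described above can be illustrated with a toy pattern matcher. This is a deliberately simplified sketch, not Tesla’s system; the feature vectors, labels, and threshold are all invented for illustration.

```python
# Toy sketch: a recognizer that only "detects" objects resembling
# patterns it was trained on. Anything too far from every known
# pattern is treated as "no object" -- the failure mode described above.
import math

# Hypothetical learned patterns (invented feature vectors)
KNOWN_PATTERNS = {
    "car_rear": [0.9, 0.1, 0.8],
    "pedestrian": [0.2, 0.9, 0.3],
}
THRESHOLD = 0.5  # max distance at which we call something a known object

def classify(features):
    """Return the nearest known label, or None if nothing is close enough."""
    best_label, best_dist = None, float("inf")
    for label, pattern in KNOWN_PATTERNS.items():
        dist = math.dist(features, pattern)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= THRESHOLD else None

print(classify([0.85, 0.15, 0.75]))  # near "car_rear" -> detected
print(classify([0.1, 0.1, 0.1]))     # unfamiliar -> None ("no object")
```

An unfamiliar sight, such as a fire truck parked at an angle with lights flashing, plays the role of the second input here: it matches nothing the system has learned, so the system reports nothing to avoid.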
Rajkumar uses the example of a fatal accident in Florida in which a Tesla on Autopilot struck a truck. Because the truck was perpendicular to the Tesla’s path, rather than facing the same direction, the system didn’t recognize it. It determined that there was no object in the way and struck the truck.
Rajkumar says that when the camera-based system encounters emergency vehicles with their lights flashing, it doesn’t match anything it has seen before, so it determines that there is no object there and strikes those vehicles.
Tesla Autopilot accidents are obviously concerning, and the fact that the cars are crashing into emergency vehicles is an added problem. The NHTSA makes it clear that Tesla’s Autopilot function is not autonomous, and it is important that Tesla operators maintain control of their vehicles at all times. Hopefully the NHTSA’s investigation will be swift and the problem easy to resolve; ideally, no one else will be injured or killed before it is.