Last week Tesla was put on notice that it is being watched very carefully by the National Highway Traffic Safety Administration (NHTSA). The agency even said it is poised to intervene if Tesla's latest software update creates safety risks. In other words, the feds are ready to pull the plug on Tesla's automation. What has the NHTSA wringing its hands?
Last week Tesla began rolling out its Full Self-Driving beta to a small percentage of customers, chosen based on their safety records. These owners will be testing the software on open roads, and because it builds on Tesla's advanced driver-assist technology, it carries real potential for risk.
Tesla is concerned enough that it will be monitoring each owner’s car
For its part, Tesla is concerned enough that it will be monitoring each owner's car running the new Autopilot software, known internally as "FSD." The beta is limited to lower speeds, disengaging once a driver accelerates past that threshold, so no testing will happen on highways. Once the FSD software has proven itself safe, it will be updated for higher speeds. With the cars limited to local roads, the potential for fatalities is substantially lowered.
According to Reuters, the NHTSA said in an official statement, "NHTSA has been briefed on Tesla's new feature, which represents an expansion of its existing driver assistance system. The agency will monitor the new technology closely and will not hesitate to take action to protect (the) public against unreasonable risks to safety."
It appears the update will remain in beta form only until the end of the year, when Tesla CEO Elon Musk says it will be released to everyone. He told investors that the more data Tesla can analyze during the beta, the safer the software will be once it is widely released. But there has been drama over the last few months that has given the NHTSA cause for concern.
19 accidents attributed to Tesla drivers who crashed with Autopilot engaged
Back in July, the NHTSA announced that 19 accidents had been attributed to Tesla drivers who had Autopilot engaged when the crash occurred. Tesla did not participate in earlier investigations of this same issue. In December, a man engaged Autopilot in his Tesla so he could check on his dog in the back seat. While he was facing backward, his Tesla rammed into a police car parked next to the highway.
No one was injured, but obviously this could have been disastrous. It was apparent that the driver did not understand what Autopilot does and does not do. Meanwhile, three competitors have formed Partners for Automated Vehicle Education, or PAVE. Ford, GM, and Google/Waymo argue that so early a rollout is dangerous.
"Public road testing is a serious responsibility and using untrained consumers to validate beta-level software on public roads is dangerous and inconsistent with existing guidance and industry norms," PAVE said. So it appears beta-level software has been in certain Tesla owners' hands for a while. As of now, no accidents have been reported.