Tesla is moving forward with the release of its “Full Self-Driving” software. Now that the newest version of the driver-assistance software prototype is out, reports from owners have grabbed the attention of safety experts. We have seen many serious, and sometimes fatal, crashes involving drivers using Tesla’s “Autopilot” feature because they believed it could literally drive for them. All the while, Tesla is harvesting data from people using the software to work out the kinks. Is Tesla responsible for testing potentially dangerous driving features, or are consumers?
Tesla is using customers as test subjects
According to Consumer Reports, these owner reports have some safety experts worried. Of course, Consumer Reports is on the case and plans to fully test the new software, just as it did when it demonstrated how easy it was to bypass the “safety” features of the previous “Autopilot” software. CR will be testing the software on its own Tesla Model Y.
FSD beta 9, as it’s popularly known, has been making the rounds on the internet since its launch last week. CR notes that its experts have been watching the videos posted by Tesla owners, and the footage is alarming. The concerns come from watching Teslas scraping bushes, missing turns, and driving toward parked cars, to name only a few. Even the (self-appointed) Technoking himself, Elon Musk, tweeted, “there will be unknown issues, so please be paranoid.”
Should Tesla be allowed to call the FSD beta 9 software “Full Self-Driving” software?
This question has been rattling around the automotive world for a while now. As the death toll mounts among drivers who over-trust Tesla’s “Autopilot” setting, multiple safety organizations have opened investigations into whether the software is fit for the road.
Even though this new beta feature is called “Full Self-Driving,” the name just isn’t accurate. No matter what people say, Teslas cannot drive themselves. The misleading name leads people to believe their Tesla can drive places autonomously, but it cannot. Granted, Tesla’s semi-autonomous driver assists are really cool, but they don’t change the fact that the car still very much requires a thinking, sentient being to operate it safely.
These features are cool, but should consumers be testing them on public roads?
While no one can deny that Tesla’s Full Self-Driving mode is pushing boundaries in many ways, that doesn’t mean public roads should be Tesla’s testing grounds with customers behind the wheel.
“Videos of FSD beta 9 in action don’t show a system that makes driving safer or even less stressful,” says Jake Fisher, senior director of CR’s Auto Test Center. “Consumers are simply paying to be test engineers for developing technology without adequate safety protection.”
It’s one thing for someone to trust Elon Musk and his electric cars to drive them around; it is quite another to subject unwilling participants to the testing of that potentially lethal sci-fi fantasy.
CR safety experts agree that Tesla using paying customers and non-consenting bystanders to test this new driving feature is quite dangerous and possibly unethical.
Bryan Reimer, a professor at MIT and founder of the Advanced Vehicle Technology (AVT) consortium, a group that researches vehicle automation, told Consumer Reports that “while drivers may have some awareness of the increased risk that they are assuming, other road users—drivers, pedestrians, cyclists, etc.—are unaware that they are in the presence of a test vehicle and have not consented to take on this risk.”
What does Tesla have to say?
Tesla remains slow to comment on these issues. The EV giant has, however, been clear that Tesla drivers need to pay attention when using the “Autopilot” feature. But simply asking people to “pay attention” is not nearly enough given the consequences at play.
At this point, we understand quite well the price that inevitably gets paid when the unwashed masses use and test this sort of new, unrefined technology. As Elon Musk has said repeatedly, he is Tesla, yet he acts as if he is powerless to stop the “self-driving” Tesla deaths. Instead, he continues to release newer, bolder versions with increasingly misleading names. We will just have to see what calamities result from this new, knowingly flawed feature.