Tesla released its Full Self-Driving software months ago, but that doesn’t mean that everyone with a Tesla has it. Initially it was available only to a small group of people, mainly Tesla employees. Now, though, the Tesla FSD beta will be available to the public in just four short weeks, around the end of September 2021. What will this mean for Tesla’s Full Self-Driving testing, as well as for everyone else on the road?
A small number of people have been testing Tesla Full Self-Driving for months
The number of people testing Tesla’s Full Self-Driving system has been slowly growing over the last few months. Now that Tesla is about to release FSD 10.0, the company is talking about releasing it to the wider public.
Tesla owners have been waiting for the wider release for some time now. As is typical with Tesla, answers about when FSD would get a more general public release haven’t been exactly forthcoming. Even now, Musk’s estimate that it could be available to download in four weeks is far from certain.
In a tweet reply to someone asking about the FSD public release timeline, Musk said, “We should be there with Beta 10, which goes out a week from Friday (no point release this week). It will have a completely retrained NN, so will need another few weeks after that for tuning & bug fixes. Best guess is a public beta button in ~4 weeks.”
What is Tesla’s Full Self-Driving like?
Tesla’s FSD is a semi-autonomous driving system that is supposed to help a driver navigate the road. While the name makes it sound autonomous, it is actually only semi-autonomous, meaning drivers are supposed to retain control of the car at all times.
Recently, Tesla ditched radar sensors in the Model 3 and Model Y. Instead, these Teslas’ safety systems, as well as Autopilot, operate via a camera-based system. This camera-based system relies on a neural network to interpret what the cameras see. The IIHS recently tested the Model 3 with the camera-based safety features and found it worthy of its Top Safety Pick+ award.
There are still problems with Tesla’s FSD
There have been eleven accidents involving Tesla’s semi-autonomous driving systems and emergency vehicles. Slate talked to Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon who specializes in self-driving cars. He explained that these accidents might happen precisely because the emergency vehicles are stopped.
When Teslas still used radar sensors, those sensors sent out electromagnetic waves, which bounced off any objects around them and came back. Because of the Doppler effect, the frequency of the returning waves changes based on how the objects around the Tesla are moving. If the objects are stationary, they’re not as easy to detect. Additionally, the Tesla is getting returns from other stationary objects like buildings and the road surface, and this can confuse the system.
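As a rough illustration of Rajkumar’s point, here is a minimal sketch of the two-way radar Doppler shift. The 77 GHz carrier frequency and the formula are standard automotive-radar assumptions, not details from the article: a stopped emergency vehicle produces the same zero shift as the road and buildings, which is why it’s harder to single out.

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_shift_hz(relative_speed_mps: float, radar_freq_hz: float = 77e9) -> float:
    """Two-way Doppler shift f_d = 2 * v * f0 / c for a radar return.

    77 GHz is a typical automotive radar band (an assumption here,
    not a Tesla specification).
    """
    return 2.0 * relative_speed_mps * radar_freq_hz / C

# A car closing at 30 m/s shows a clear frequency shift...
print(f"moving car:   {doppler_shift_hz(30.0):.0f} Hz")
# ...while a stopped emergency vehicle shows none, just like the background.
print(f"parked truck: {doppler_shift_hz(0.0):.0f} Hz")
```

The shift from the moving car is on the order of 15 kHz, while the stopped vehicle’s return is indistinguishable, by frequency alone, from clutter such as guardrails and pavement.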
If Tesla’s Full Self-Driving really is released to the public in four weeks, it will be interesting to see how the average Tesla driver does with the software. Until the software is perfected, or at least a whole lot better, things might be dicey for a while. We’ll have to see how the rollout goes, and how Tesla drivers handle their new power.