
Following the NHTSA’s announcement that it is investigating Tesla’s recent spate of accidents involving emergency vehicles, two U.S. senators are asking the Federal Trade Commission (FTC) to look into Tesla. Their primary concern? Tesla’s marketing of the Autopilot feature. Senators Ed Markey (D-Massachusetts) and Richard Blumenthal (D-Connecticut) are imploring the FTC to consider whether Tesla is misleading the public into believing that Autopilot is more capable of autonomous operation than it really is.

2021 Tesla Model 3 | Tesla

Why do the senators want Tesla investigated?

2021 Tesla Model S | Tesla

Both Markey and Blumenthal are concerned that Tesla is marketing its Autopilot and Full Self-Driving (FSD) features as being more autonomous than they really are. Tesla does include disclaimers that Autopilot and Full Self-Driving are meant to be used with a driver overseeing their operation. Yet this hasn’t stopped people from misusing them. People keep dying in Teslas with Autopilot and FSD engaged, which raises the question of who’s responsible.

Markey and Blumenthal expressed the belief that Tesla is to blame in a letter to Lina Khan, chair of the FTC. They write, “Tesla’s marketing has repeatedly overstated the capabilities of its vehicles, and these statements increasingly pose a threat to motorists and other users of the road.” 

The senators believe Tesla’s marketing is deceptive

The letter to Khan goes on to say that Tesla’s marketing is deceptive because it leads people to believe these features are capable of self-driving. The senators point out that “there are no fully autonomous vehicles currently available on the market.”

Yes, Tesla’s disclaimers state that a person must be in control of their Tesla at all times. But the two senators say Tesla’s actual marketing campaigns undercut this. They point to a YouTube video that shows a Tesla operating alone with Full Self-Driving engaged. Markey and Blumenthal note that the disclaimers about safely using these features aren’t even readily available for people to see: “While Tesla has buried qualifying disclaimers elsewhere on their website, the link in the video’s caption redirects to a purchasing page that fails to provide additional information about the true capabilities of the vehicle.”

Markey and Blumenthal want Tesla to be more transparent about what Autopilot and FSD can and cannot do. “Understanding these limitations is essential, for when drivers’ expectations exceed their vehicle’s capabilities, serious and fatal accidents can and do result.” 

People continue to misuse Autopilot and Full Self-Driving


Despite both Tesla’s warnings and stories of fatal accidents involving Autopilot and FSD, people continue to misuse both semi-autonomous features. Each driving system amounts to little more than an advanced cruise control. Still, people believe that a Tesla can safely drive itself. This is not the case, as the rising number of fatal accidents proves.

In addition to fatal accidents, people keep doing dangerous things with Tesla Autopilot and Full Self-Driving engaged. A man was arrested for sleeping while his Tesla hurtled along at 80 mph. Recently, an intoxicated Tesla driver was found passed out while their car drove. The Tesla pulled over once it detected there wasn’t a coherent driver in control. However, these safeguards aren’t difficult to fool, and plenty of people have done just that.

Additionally, Teslas have been crashing into emergency vehicles – which often have their lights flashing and signals on. Now the National Highway Traffic Safety Administration is investigating. Evidently, Markey and Blumenthal aren’t the only ones concerned with Tesla’s semi-autonomous features.

The technological advancements of Autopilot and Full Self-Driving are exciting and could lead to more lives being saved. Yet this is only true if they’re used safely and within the parameters of their capabilities. As Markey and Blumenthal point out, too many Tesla drivers either aren’t aware of those limits or are choosing to ignore them. And as long as people are misusing Autopilot and FSD – intentionally or not – the lives of not just Tesla drivers, but everyone on the road, are at risk.