
Another Tesla driver has died. In late February, Terry L. Siegal, 74, was driving his Tesla Model 3 in Independence, Missouri, when the car appeared to stop in the middle of the highway. Siegal died after his car was struck by two other vehicles coming down the highway. Thankfully, his passenger and the other two drivers escaped with only minor injuries. Was this another Tesla Autopilot mishap, or something else?

Tesla Model 3s lined up in front of a dealership | Ding Ting/Xinhua via Getty

Why did his Tesla Model 3 stop on the highway? 

According to Carscoops, authorities are still investigating the cause of the crash, including why the vehicle was stopped. The tragic incident happened on I-70 just east of Kansas City. According to local authorities, the Tesla came to a complete stop in the middle of the road, and two approaching vehicles failed to avoid it. 

Although this may trigger autopilot fears, the police said that “the initial investigation indicates that Terry and his passenger were in a Tesla when a mechanical issue caused the vehicle to lose power and stop in the roadway. As a result, approaching vehicles were unable to avoid colliding with them. It appears that everyone involved was wearing a seatbelt. The crash remains under investigation.” 

What would make the Tesla break down? 

Carscoops says that soon after the initial statement, the police walked back part of what they said: they now say they don’t know why the Tesla Model 3 stopped. As the investigation continues, investigators will search for telemetry data that could explain the sudden stop.

Carscoops also mentioned that the police haven’t released the identity of the passenger in the Tesla Model 3, who survived. This person’s account of the horrific accident may be what prompted the authorities to change their statement.

What is Tesla “phantom braking”?

“Phantom braking is what happens when the developers do not set the decision threshold properly for deciding when something is there versus a false alarm,” Phil Koopman, a Carnegie Mellon University professor who focuses on autonomous vehicle safety, told The Washington Post earlier this month. “What other companies do is they use multiple different sensors, and they cross-check between them – not only multiple cameras but multiple types of sensors.”
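Koopman’s point can be illustrated in a few lines of code. This is a deliberately simplified sketch, not Tesla’s actual software; the function names, sensor values, and thresholds are all invented for illustration. The idea is that a single sensor with a loose decision threshold will “see” obstacles that aren’t there, while cross-checking several independent sensors can reject a lone false alarm.

```python
# Illustrative sketch only -- not Tesla's implementation.
# Shows how a detection threshold and multi-sensor cross-checking
# affect false alarms (the "phantom" obstacles Koopman describes).

def obstacle_detected(confidence: float, threshold: float) -> bool:
    """Single-sensor rule: brake if one sensor's confidence clears the threshold."""
    return confidence >= threshold

def cross_checked_detection(confidences: list, threshold: float) -> bool:
    """Cross-check rule: brake only if a majority of independent sensors agree."""
    votes = sum(1 for c in confidences if c >= threshold)
    return votes > len(confidences) / 2

# Hypothetical readings: a camera glitch reports a high-confidence
# obstacle while radar and a second camera see nothing.
camera_a, radar, camera_b = 0.9, 0.1, 0.15

# Relying on the single glitching sensor triggers a phantom brake event.
print(obstacle_detected(camera_a, threshold=0.5))                        # True

# Cross-checking all three sensors rejects the lone false alarm.
print(cross_checked_detection([camera_a, radar, camera_b], threshold=0.5))  # False
```

The same majority-vote idea generalizes: requiring agreement between different sensor types (camera, radar, lidar) trades a slightly slower response for far fewer false positives.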

Shortly before this Model 3 accident, the NHTSA launched yet another investigation into Tesla after numerous owners reported this “phantom braking.” Reports say that Teslas have been braking randomly and aggressively at highway speeds. The danger here is obvious.

The Tesla owners’ complaints allege that while using advanced driver-assistance features like adaptive cruise control, the vehicle unexpectedly slams on the brakes at highway speeds. They report that the braking events happen sharply and, seemingly, without cause. Worse still, this can happen many times in a single drive, despite the recent recall.

The investigation found that more than 400,000 Teslas may be at risk of phantom braking. So far, the NHTSA has received 354 complaints about phantom braking over the last nine months, and the reports seem to be coming in at a higher rate: 107 of those complaints arrived in the last three months alone. Owners call the phantom braking “hair-raising” and have begun expressing a lack of trust in Tesla’s advanced driving systems.

Are Teslas safe? 

As far as normal car testing and safety equipment are concerned, yes, Teslas are plenty safe. However, the “autonomous driving” mode and even cruise control have been tied to numerous deaths in the past year. 

Many people seem eager to trust a car’s programming with their lives and the lives of others, but how many times can our cars “decide” to take action that leads to a driver’s death before we slow our roll with this tech?

Related

Tesla Model 3 Blows Demonstration After Violently Swerving at Biker During New Tesla ‘Full Self-Driving’ Safety Video