
Tesla has a lot invested in self-driving, autonomous vehicles. But getting to the Cybercab and the latest versions of the Full Self-Driving (FSD) suite hasn't been the smoothest road. Consider the case of a Tesla Model S that struck a parked car and killed a pedestrian while on Autopilot. It's the subject of an ongoing legal case, even though the driver was looking down to retrieve his phone at the time.

Tesla Autopilot is on trial for a crash that cost the life of a pedestrian, even though the driver 'looked down' to get his phone

George McGee was driving his then-new 2019 Tesla Model S on Autopilot when he dropped his cell phone. McGee reached down to the floorboard to retrieve it just as his Model S approached a "T" intersection. By the time McGee looked back up, the Autopilot-enabled Model S was on a collision course with a parked Chevrolet Tahoe.

While there was no occupant in the stationary Tahoe, 22-year-old Naibel Benavides Leon and another pedestrian were on the other side of the SUV. Unfortunately, the Model S pushed the Tahoe into the two pedestrians, killing Naibel Benavides Leon and seriously injuring the other person.

The lawsuit following Naibel Benavides Leon’s death asserts that Tesla ‘failed to warn the driver about the Autopilot system’

While tragic, this would be fairly straightforward with just about any other car. However, the involvement of Tesla's Autopilot complicates things. McGee had activated Autopilot before he reached the intersection, including a 45-mph speed limitation, per Car Complaints.

Here's the kicker: McGee applied the accelerator between activating Autopilot and entering the intersection. That single action deactivated the 45-mph restriction. But, as the lawsuit argues, his throttle input should have left other Autopilot features active.

McGee's Model S "detected a stop sign, a stop bar, the road's edge, a pedestrian, and a parked Chevrolet Tahoe" before the crash. McGee even owned up to his involvement, acknowledging that "there was nothing that prevented him from acting to prevent the crash."

A judge claims that Autopilot can cause drivers to ‘become complacent and over-rely’ on the function

Judge Beth Bloom ultimately allowed the lawsuit to go to trial before a jury. The reason? Judge Bloom asserted that the Tesla driver likely didn't know he shouldn't have relied on Autopilot to the extent that he did.

Moreover, Judge Bloom took issue with the name Autopilot itself: "The term 'Autopilot' itself is a potentially confusing term as it arguably implies to the user that the 2019 Model S was more autonomous or at least had more capabilities than it really did."

Ultimately, Judge Bloom found that Tesla drivers may be inclined to “become complacent and over-rely on Autopilot to operate their vehicles. A reasonable jury could find that Tesla acted in reckless disregard of human life for the sake of developing their product and maximizing profit,” Judge Bloom said of the case.
