
The accident happened in February 2023, just after 4 a.m. Giovanni Mendoza Martinez, 31, slammed his 2014 Tesla Model S into a fire truck parked diagonally across two lanes. Giovanni died, and his brother, riding as a passenger, was injured.

Now, the driver’s family in Contra Costa County and the insurer for the fire department are both taking Tesla to court.

Martinez had engaged Tesla’s Autopilot roughly 12 minutes before the collision

Both the family and the insurer, Public Risk Innovation, Solutions, and Management (PRISM), claim Martinez believed the Tesla could drive itself.

The ladder truck was assisting first responders at the scene of an earlier crash. The Tesla was traveling at about 71 mph when it struck the fire truck.

According to CarComplaints.com, the lawsuits argue that Tesla misrepresented what its Autopilot system could safely handle.

Vehicle data shows Martinez didn’t touch the pedals while Autopilot was engaged, and he only “generally” kept contact with the steering wheel, vague phrasing that appears in both lawsuits.

The insurer points to a “key flaw” in Tesla’s Autopilot

PRISM argues that Tesla’s camera system processes individual frames in isolation, which makes it unable to reliably distinguish emergency vehicles with flashing lights from normal traffic.

Had Autopilot been capable, it says, the collision might have been avoided.
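To see why this matters, here is a minimal sketch of the insurer’s core point: a flashing strobe is a temporal signal, so logic that looks at one frame at a time has nothing to compare against, while logic that compares consecutive frames sees the flicker immediately. This is not Tesla’s code; the frame values, threshold, and function names are all hypothetical, purely for illustration.

```python
import numpy as np

# Hypothetical brightness jump (0-255 scale) that would flag a strobe.
FLASH_THRESHOLD = 60

def single_frame_flags_strobe(brightness: float) -> bool:
    # A lone frame carries no temporal signal: a bright region could be a
    # streetlight, headlights, or a strobe mid-flash. There is nothing to
    # compare against, so single-frame logic cannot make a reliable call.
    return False

def sequence_flags_strobe(brightness_per_frame: list[float]) -> bool:
    # Comparing consecutive frames exposes the on/off flicker pattern.
    diffs = np.abs(np.diff(brightness_per_frame))
    return bool(np.any(diffs > FLASH_THRESHOLD))

# Made-up brightness readings of one image region containing a strobe,
# sampled over ten consecutive frames.
frames = [30, 30, 200, 200, 30, 30, 200, 200, 30, 30]

print(single_frame_flags_strobe(frames[2]))  # False: one frame is ambiguous
print(sequence_flags_strobe(frames))         # True: the flicker stands out
```

Real perception stacks are vastly more complex, but the underlying claim in the lawsuits reduces to this: without frame-to-frame context, a flashing light looks like any other bright object.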

Tesla is pushing back, citing repeated warnings for drivers to stay alert and keep their hands on the wheel. The EV maker also notes that its manuals clearly state drivers remain responsible for the vehicle at all times.

We reported in April 2024 on the National Highway Traffic Safety Administration opening a fresh investigation into Tesla’s Autopilot, questioning whether the company’s earlier recall remedy actually fixed the software’s shortcomings.

The NHTSA found numerous accidents were avoidable, with hazards often visible seconds before impact.

Tesla issued over-the-air updates meant to better enforce driver attention, but adoption is optional, and collisions continued, highlighting persistent safety gaps.

In response to repeated criticism and a continuing string of preventable crashes, Tesla also rebranded its Full Self-Driving system, adding “(Supervised)” to the name.

Even advanced driver-assistance systems are far from replacing human attention

Legal accountability can quickly get messy when the limits of automation collide with drivers who don’t fully grasp those limits.

The case is ongoing in the U.S. District Court for the Northern District of California, with the family and insurer seeking to hold Tesla responsible not only for the damage and loss but also for what they claim was misleading marketing of Autopilot’s capabilities.
