Self-Driving Car Accidents: Who Is Responsible for Fatalities?
As the demand for eco-friendly cars rises, more auto companies are prioritizing CASE values (connectivity, autonomous, sharing/subscription, electrification) for new vehicles. Toyota and Subaru were among the first companies to add electric cars to their lineups, but now U.S. automakers are starting to expand their EV offerings as well.
An all-electric version of the popular Ford F-150 is expected to be released in the near future, following its powerful demonstration videos. However, even the most advanced technology is prone to error. Autonomous vehicles have caused several fatalities in recent years. Although this doesn’t mean all self-driving cars are dangerous, it should prompt automakers to reevaluate their technology to avoid future accidents.
Uber’s self-driving car fatality
The first pedestrian fatality caused by a self-driving car occurred on March 18, 2018. Elaine Herzberg was walking her bike outside of a crosswalk late at night when an autonomous Uber test vehicle approached. Even though a safety driver was behind the wheel, the car did not stop in time, and Herzberg died of her injuries.
This was not the first incident involving Uber’s self-driving test fleet on public roads; the cars had caused minor traffic violations and fender benders in the past. After Herzberg’s death, the company halted its real-world testing to investigate. Uber found that the incident was caused by a software feature that ignored objects on the road that the system did not classify as hazardous to the vehicle.
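To make the failure mode concrete, here is a minimal, purely hypothetical sketch (not Uber’s actual code, and the object labels and function names are invented for illustration) of how a hazard-classification filter can suppress a braking response: any object the perception system fails to label as hazardous is dropped before the planner ever considers braking.

```python
def should_brake(detected_objects):
    """Return True if any detected object is classified as hazardous."""
    # Objects not flagged as hazardous are filtered out entirely,
    # so the planner never reacts to them.
    hazards = [obj for obj in detected_objects if obj["hazardous"]]
    return len(hazards) > 0

# A pedestrian mislabeled as non-hazardous is filtered away,
# so no braking is triggered.
scene = [{"type": "pedestrian_with_bike", "hazardous": False}]
print(should_brake(scene))  # False: the car does not brake
```

The danger of this design is that a single misclassification silently removes an object from consideration, rather than triggering a cautious fallback such as slowing down for anything unidentified.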
Tesla’s self-driving problems
According to Tesla, two people have died as a result of driving with the Autopilot feature engaged since 2016 — one accident for every 2.91 million miles driven in 2018.
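For context, Tesla’s reported figure can be restated as an accident rate per million miles. This is simple arithmetic on the number quoted above, not an additional statistic:

```python
# Convert Tesla's reported 2018 figure (one accident per 2.91 million
# miles with Autopilot engaged) into accidents per million miles.
miles_per_accident = 2.91e6
accidents_per_million_miles = 1e6 / miles_per_accident
print(round(accidents_per_million_miles, 3))  # 0.344
```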
With Autopilot enabled, Tesla drivers can supposedly pay less attention to the road, since the car can automatically change lanes, center itself within its lane, and adhere to speed limits. However, the technology is not perfect: one driver reported that his car swerved into an exit lane by accident and that he could not take control because Autopilot had locked the steering wheel.
Who’s to blame?
Many factors are at play when it comes to determining who is liable for these accidents. In these cases, the fault lies with the underdeveloped technology controlling the vehicle. Automakers should not allow their cars to be driven by consumers or testers in real-world settings without thoroughly testing them in controlled settings first.
Some argue that Uber’s backup driver was at fault. The vehicle in question had Level 4 autonomy, meaning the human driver can request control at any time. If so, the backup driver should have been paying attention to the road. However, since the incident occurred at night and the pedestrian was walking outside of the crosswalk, one could argue that limited visibility was also a factor in Herzberg’s death.
Despite the enthusiasm for autonomous vehicles, it’s more important that automakers ensure the self-driving technology is fully developed and that drivers can easily take control in an emergency. Autonomous and electric cars are the future of the auto industry, but a bad reputation built on repeated accidents and hardware failures could make these vehicles difficult to market.