
IIHS Gets Serious About Alleged Self-Driving Cars’ Safety Ratings

Starting this year, the IIHS won't just be issuing warnings about so-called 'self-driving cars.' It's now going to score partially autonomous technology the same way it rates other car safety features. And in doing so, it hopes to make these systems harder to misuse in the real world.

It's the IIHS's duty to monitor, inform, and advise consumers about the current state of car safety. And lately, that means more than just rating headlights and running crash tests. Nowadays, the IIHS, like the NHTSA, also has to deal with the chaos caused by autonomous vehicles, aka 'self-driving cars.' Now, though, it's going further than just issuing warnings.

The IIHS will start giving ‘self-driving cars’ official safety scores

IIHS tests a Tesla advanced driver-assistance safety suite | IIHS

Because the IIHS isn't a government agency, it can't set legal car safety standards. That's what the NHTSA does. However, it can guide the state of safety tech with its rating system, which in some areas is arguably ahead of the NHTSA's. And starting this year, that rating system will include partially autonomous cars.

This new 'self-driving car' rating works much like the IIHS's other safety tests. After evaluation, systems earn one of four scores: Good, Acceptable, Marginal, or Poor. To earn a 'Good' score, the partial-autonomy technology must keep the driver's eyes on the road and their hands either on the wheel or always ready to grab it. Also, it shouldn't work if the driver disables any safety features or doesn't fasten their seatbelt.

In addition, if the driver's attention drifts, the car must issue "escalating alerts" to get them back on track. Think warning chimes, a pulsing steering wheel, etc. And if that's not enough to refocus the driver, the car must implement "appropriate emergency procedures." In the IIHS's view, that means coming to a halt, notifying the manufacturer, and locking the driver out of the 'self-driving' functions until the car is switched off and back on.

As of this writing, the IIHS is still designing the testing procedures for this new safety initiative, but it expects to publish the first results later this year. It's unclear, though, whether these scores will factor into its Top Safety Pick program.

No matter what the OEMs claim, no partially autonomous car is self-driving right now

Considering the recent rash of crashes involving 'self-driving cars,' the IIHS's new safety initiative arguably couldn't come soon enough. And the institute acknowledges that. Right now, no automaker's partial-autonomy technology merits a 'Good' score. Not Honda's Japan-only Level 3 Legend, not GM's Super Cruise, and especially not Tesla's Autopilot or 'Full Self-Driving' suite.

Admittedly, Tesla isn't the sole reason for the IIHS's new safety scores, but it plays a significant role, The Drive says. Not only does FSD not deliver full autonomy, but it's downright dangerous in the real world. Plus, the IIHS specifically calls out partial-autonomy suites that don't require the driver to initiate, or at least confirm, lane changes. That's something FSD and Autopilot often do on their own.

It also doesn’t help matters that Tesla is notorious for over-hyping its technology’s capabilities. This inflates owners’ confidence and further heightens the danger to everyone else because the drivers don’t know the tech’s limitations. They believe their cars are self-driving when really they have advanced driver-assistance features.

Technology doesn’t replace proper training, care, and attention

That's the real goal of this new safety scoring system, the IIHS claims. "The way many of these systems operate gives people the impression that they're capable of doing more than they really are," IIHS Research Scientist Alexandra Mueller says. "But even when drivers understand the limitations of partial automation, their minds can still wander." These new scores aren't about testing self-driving cars' hardware, but how well they deal with inevitable human behavior and "abuse."

"As humans," Mueller notes, "it's harder for us to remain vigilant when we're watching and waiting for a problem to occur than it is when we're doing all the driving ourselves." That's why a hypothetical 'Good' score requires so many safeguards. It's also why advanced safety features, though they can reduce accident rates, don't eliminate them entirely. It doesn't matter how advanced your driver-assistance system is if the driver doesn't know how to use it or deliberately messes with it. And as of right now, the latter is too easy to do.

For the foreseeable future, there won't be any truly self-driving cars on the road. And pretending that partially autonomous cars are self-driving hurts everyone, from drivers to pedestrians to the automakers themselves. But from the sounds of things, this new IIHS safety scoring initiative could educate owners and force OEMs to improve their tech's fail-safes. That's a safety win all around.

