The U.S. National Highway Traffic Safety Administration (NHTSA) announced on Tuesday that vehicle manufacturers must report all incidents in which driver-assistance and self-driving systems are involved in a serious accident.
As reported by the Washington Post, the NHTSA said that the new requirements will apply to all systems it defines as Level 2 to Level 5 automation. The NHTSA considers Level 0 automation to be none whatsoever and Level 1 to be vehicles operated by humans where “some driving assist features may be included.”
Level 2 is any driver assistance system that “has combined automated function, like acceleration and steering, but the driver must remain engaged,” while Levels 3-5 require progressively less human operation. Under the new rules, accidents involving hospitalisation or death, a bicyclist or pedestrian, a tow truck, or airbag deployment must be reported within one day of the automaker learning of the crash, while other crashes must be reported monthly.
That means virtually all of the advanced self-driving and driver-assist technologies under development by companies like Tesla, Alphabet-owned Waymo, Aurora, and major auto manufacturers such as Toyota will be subject to the new requirements. Currently, statistics on just how dangerous (or safe) automated and advanced assistance driving may be are hard to come by.
Level 3 to 5 vehicles are not commercially available but various manufacturers are testing them. While historically crash data has been stored in onboard black boxes, vehicles are increasingly connected to the internet, and data on assisted or autonomous driving might be beamed over the air exclusively to manufacturers’ databases, as in the case of Tesla.
In a news release, the NHTSA wrote that reporting requirements will apply to all incidents where a Level 2 or above “system was engaged during or immediately before the crash,” which it added might show “common patterns in driverless vehicle crashes or systematic problems in operation.” Acting Administrator Steven Cliff characterised the decision in the release as potentially good news for manufacturers, as it could “help instill public confidence” that the federal government is ensuring autonomous systems are safe on the nation’s roads.
The feds have, in the past, mostly left the nascent sector to regulate itself, leaving oversight of autonomous vehicles largely to the states. Since June 2016, the NHTSA reports sending special investigation teams to 31 crashes involving partially automated driver-assist systems, according to Al Jazeera.
Tesla’s Autopilot feature, which is Level 2, has attracted particular scrutiny after a number of Tesla owners got into accidents while the feature was enabled, perhaps under the impression, encouraged by the company’s marketing, that Autopilot is far more advanced than it really is. According to the Post, the NHTSA is aware of more than two dozen accidents involving Autopilot, including one in which a Tesla driver died after slamming into a truck. As the Verge noted, there have been at least 11 deaths in nine crashes involving Tesla Autopilot since 2015, and another nine deaths in 11 accidents across the globe.
Uber abandoned its self-driving ambitions after an infamous 2018 incident in which a car operated by a distracted worker struck and killed Elaine Herzberg in Arizona. Waymo, the Alphabet subsidiary, released data last year showing that over the course of 6.1 million autonomous miles driven in Phoenix, Arizona, its prototype cars were involved in 18 collisions and 29 separate incidents in which a human intervened, likely to avoid one.
For comparison, the NHTSA reported 84 injured persons per 100 million vehicle miles nationally in 2019. The RAND Corporation published a study in 2016 arguing it would take data from hundreds of millions to billions of miles travelled to determine how many injuries and fatalities might be caused by driverless vehicles versus standard ones.
The NHTSA is also investigating accidents involving partially automated Lexus, Volvo, and Cadillac models, Al Jazeera reported, as well as a Navya Arma low-speed shuttle.
“The agency has apparently finally heard the Centre for Auto Safety’s long-standing call for the federal government to engage in oversight of the unregulated technology currently being tested on America’s roads with neither a warning to the residents, nor any data being collected,” Centre for Auto Safety executive director Jason Levine told Reuters.