Report: Nearly 400 crashes by ‘self-driving’ cars in the US

Data collected by a US regulatory agency will allow for greater transparency on the safety of semiautonomous vehicles

A Waymo minivan drives passengers during an autonomous vehicle ride in Chandler, Arizona [File: Ross D Franklin/AP]

US car manufacturers reported nearly 400 crashes involving cars with partially autonomous driver assistance systems, according to a new report from a US car-safety regulator released on Wednesday.

Tesla, which has about 830,000 vehicles on the road with driver-assist programmes that have partial control over speed and steering, reported 273 crashes, about 70 percent of the total, according to The Associated Press. Companies caution that drivers must remain prepared to intervene and take control of driving at all times, even in cars with partially autonomous systems.

The National Highway Traffic Safety Administration (NHTSA) collected reports of such crashes from manufacturers from July 2021 through May 2022, the first broader report of its kind. The NHTSA said the report provided “crucial data necessary for research and for the development of policies to enhance the safety of these technologies”.

The report brings new data to a debate in the United States over the safety of “self-driving” cars and the appropriate regulatory approach to such vehicles, which polling shows Americans still view with scepticism. A 2021 poll by the Pew Research Center found that 63 percent of US adults would not want to ride in a driverless vehicle, while 37 percent said they would.

The new data will give the public a greater ability to track the safety of partially autonomous vehicles, Michael Brooks, acting director of the watchdog group Center for Auto Safety, told Al Jazeera in a phone call.

“The real story here is that this data will be updated monthly, and the public now has a way to monitor the safety of these vehicles,” said Brooks. “This is a dataset the NHTSA can draw from in the future for enforcement actions or crafting new regulations.”

Industry groups representing the interests of car manufacturers said the data is insufficient to draw clear conclusions about the use of partially autonomous vehicles, according to the AP.

Twelve car companies reported crashes involving vehicles with partially autonomous driver assistance to the NHTSA, with Honda reporting 90 crashes, the second most after Tesla.

The NHTSA required manufacturers to report crashes if partially automated systems were operating within 30 seconds before the crash and the incident involved an airbag deployment, a collision with a pedestrian or cyclist, or the hospitalisation of a victim.

However, the report also noted that manufacturers vary in their ability to receive real-time data on an incident, and that some incidents may go unreported altogether. Tesla, for example, uses telematics to receive information on such incidents, which may help explain the large share of reported crashes attributable to Tesla.

“Due to variation in data recording and telemetry capabilities, The Summary Incident Report Data should not be assumed to be statistically representative of all crashes,” said the report.

US carmakers have looked to autonomous vehicles with increasing interest, with Tesla’s Elon Musk recently noting that self-driving is “really the difference between Tesla being worth a lot of money or worth basically zero”.

Tech companies such as Apple and Google have also been eager to stake their claim. According to the AP, Waymo, Google’s autonomous vehicle unit, operates 700 self-driving vehicles, and is testing an autonomous ride-hailing programme in Arizona.

“Test programmes for truly self-driving cars are relatively small. But there are millions of cars on the road with these semiautonomous crash avoidance systems, and they haven’t been subject to much regulation,” said Brooks with the Center for Auto Safety.

Critics of those programmes have alleged that companies are pushing too far too fast, prioritising progress on autonomous driving technology over public safety. Tesla was pushed to suspend an “assertive” self-driving mode earlier this year that allowed its cars to roll through stop signs without coming to a complete stop.

Debates have raged about the appropriate level of oversight for such programmes, and videos showing self-driving cars making dangerous mistakes, such as driving into oncoming traffic and veering into metal poles, have alarmed regulators.

While Tesla has promoted its cars as “self-driving”, the company has managed to skirt certain regulatory requirements in states such as California by arguing that since its systems still require driver intervention, they cannot be defined as “autonomous”.

The California Department of Motor Vehicles (DMV) had accepted that argument. But in a letter responding to questions from state Senator Lena Gonzalez, chair of the California Senate Committee on Transportation, the DMV stated that it was revisiting that decision “following recent software updates, videos showing a dangerous use of that technology, open investigations by the National Highway Traffic Safety Administration (NHTSA), and the opinions of other experts in this space”.

Source: Al Jazeera and news agencies