Tesla to recall 362,000 US vehicles on full self-driving software
Tesla software allows a vehicle to ‘exceed speed limits or travel through intersections in an unlawful’ manner, says the US National Highway Traffic Safety Administration.
Tesla Inc said it will recall 362,000 United States vehicles to update its Full Self-Driving (FSD) Beta software after US regulators said the driver assistance system did not adequately adhere to traffic safety laws and could cause crashes.
The National Highway Traffic Safety Administration (NHTSA) on Thursday said the Tesla software allows a vehicle to “exceed speed limits or travel through intersections in an unlawful or unpredictable manner [that] increases the risk of a crash.”
Tesla will release an over-the-air (OTA) software update free of charge, and the electric vehicle (EV) maker said it is not aware of any injuries or deaths that may be related to the recall issue. The automaker said it had received 18 warranty claims.
Tesla shares were down 1.6 percent at $210.76 on Thursday afternoon.
The recall covers 2016-2023 Model S, Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles equipped with FSD Beta software or pending installation.
NHTSA asked Tesla to recall the vehicles; the company agreed to do so but said it did not concur with the agency’s analysis.
The move is a rare intervention by federal regulators in a real-world testing programme that the company sees as crucial to the development of cars that can drive themselves. FSD Beta is used by hundreds of thousands of Tesla customers.
The setback for Tesla’s automated driving effort comes about two weeks before the company’s March 1 investor day, during which Chief Executive Elon Musk is expected to promote the EV maker’s artificial intelligence capability and plans to expand its vehicle lineup.
Tesla could not immediately be reached for comment.
NHTSA opened an investigation in 2021 into 830,000 Tesla vehicles equipped with the driver assistance system Autopilot over a string of crashes with parked emergency vehicles. The agency is reviewing whether Tesla vehicles adequately ensure drivers are paying attention. NHTSA said on Thursday that despite the FSD recall, its “investigation into Tesla’s Autopilot and associated vehicle systems remains open and active”.
Tesla said in “certain rare circumstances … the feature could potentially infringe upon local traffic laws or customs while executing certain driving maneuvers.”
Possible situations where the problem could occur include travelling or turning through certain intersections during a yellow traffic light and making a lane change out of certain turn-only lanes to continue travelling straight, NHTSA said.
NHTSA said “the system may respond insufficiently to changes in posted speed limits or not adequately account for the driver’s adjustment of the vehicle’s speed to exceed posted speed limits”.
Last year, Tesla recalled nearly 54,000 US vehicles with FSD Beta software that may allow some models to conduct “rolling stops” and not come to a complete stop at some intersections, posing a safety risk, NHTSA said.
Tesla and NHTSA say FSD’s advanced driving features do not make the cars autonomous and require drivers to pay attention.