The National Highway Traffic Safety Administration has closed a long-running investigation of Tesla’s Autopilot driver assistance system after reviewing hundreds of crashes linked to its misuse, a toll that included 13 deaths and “numerous serious injuries.”
At the same time, NHTSA is opening a new investigation to evaluate whether the Autopilot recall fix Tesla implemented in December is actually effective.
NHTSA’s defect investigation office said in documents released Friday that it had completed “an extensive body of work” that turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”
“This mismatch resulted in a significant safety gap between drivers’ expectations of (Autopilot’s) operational capabilities and the system’s actual capabilities,” the agency wrote. “This gap led to potential misuse and avoidable accidents.”
The conclusion of the investigation, which began in 2021, marks the end of one of the government’s most visible efforts to scrutinize Tesla’s Autopilot software. However, Tesla still faces pressure from numerous other inquiries.
The Justice Department is also investigating the company’s claims about the technology, and the California Department of Motor Vehicles has accused Tesla of falsely advertising the capabilities of Autopilot and the more advanced Full Self-Driving Beta software. The company also faces numerous lawsuits over Autopilot. Meanwhile, according to CEO Elon Musk, Tesla is now “working hard for autonomy.”
NHTSA said its investigation reviewed 956 crashes reported through Aug. 30, 2023. In nearly half of them (489), the agency said there was “insufficient data to make an assessment,” another vehicle was at fault, Autopilot was found not to be in use, or the crash was otherwise unrelated to the investigation.
The remaining 467 crashes fell into three buckets, NHTSA said. In 211 crashes, “Tesla’s frontal plane struck another vehicle or obstacle with enough time for an attentive driver to avoid or mitigate the crash.” It said 145 crashes involved “departure from the roadway in low traction conditions such as wet roadways,” and 111 crashes involved “departure from the roadway where Autosteer inadvertently disengaged from driver inputs.”
“These crashes are often severe because neither the system nor the driver responds appropriately, resulting in high speed differentials and high energy crash outcomes,” the agency wrote.
Tesla tells drivers they need to pay attention to the road and keep their hands on the wheel when using Autopilot, which it measures through torque sensors and, in its newer cars, in-cabin cameras. But NHTSA and other safety groups have said these warnings don’t go far enough. In December, NHTSA said these measures were “insufficient to prevent misuse.”
Tesla agreed to issue the recall through a software update that would theoretically increase driver monitoring. But that update didn’t actually appear to change Autopilot much — a sentiment with which NHTSA appears to agree.
According to NHTSA, parts of that recall fix require “the owner to opt in” and allow Tesla drivers to “easily reverse” some safety measures.
NHTSA spent nearly three years on the Autopilot investigation, and met or spoke with Tesla several times throughout the process. It conducted a number of crash investigations of its own, and also relied on the company to provide data about the accidents.
But the agency criticized Tesla’s data in a supporting document.
“Gaps in Tesla’s telematic data create uncertainty about the actual rate at which vehicles operating with Autopilot are involved in crashes. Tesla does not know about every accident involving Autopilot, even serious ones, due to gaps in telematic reporting,” NHTSA wrote. According to the agency, Tesla “largely only receives data from crashes with pyrotechnic deployment,” meaning when an air bag, a seat belt pre-tensioner, or the hood’s pedestrian impact mitigation feature is triggered.
NHTSA says that limitation means Tesla is only collecting data on about 18% of crashes reported to police. As a result, the agency wrote, the investigation uncovered crashes in which Autopilot was engaged but of which Tesla was never notified via telematics.