U.S. Regulators Turn Up Heat on Tesla’s Full Self-Driving Software After Crash Deaths

The National Highway Traffic Safety Administration’s Office of Defects Investigation (ODI) has intensified its investigation into Tesla’s Full Self-Driving (Supervised) software. The probe, launched in October 2024, has now advanced to an “engineering analysis,” the agency’s highest scrutiny tier and a step that often precedes a recall order.

This escalation follows four reported crashes involving the software in low-visibility conditions, including one that killed a pedestrian. ODI has since flagged additional incidents where the system failed to handle poor visibility, suggesting the problem may be more widespread.

Why this matters for drivers and investors

For Tesla owners, the stakes are real. The investigation targets the software’s inability to detect degraded camera performance or alert drivers in time to prevent accidents. ODI’s review found that in several crashes, the system either lost track of vehicles ahead or failed to spot them entirely. In the agency’s words: “The system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred.”

Beyond safety, this probe could force Tesla to issue a recall, impacting thousands of vehicles and potentially delaying its push for a robotaxi service in Austin, Texas. The company has spent months hyping its autonomous ambitions, but regulatory action could slam the brakes on deployment and shake investor confidence.

Regulator frustration and missing data

ODI says Tesla started working on a fix for the low-visibility bug in June 2024, before the official probe began. But the agency claims it still hasn’t received key details, like whether the update was actually rolled out or which vehicles got it. ODI also suspects that crash data may be underreported because of limitations in Tesla’s own data collection and labeling.

“Review of Tesla’s responses revealed additional crashes that occurred in similar environments and where the system either did not detect a degraded state, and/or it did not present the driver with an alert with adequate time for the driver to react.”

What’s next?

This is one of two ongoing federal investigations into Tesla’s Full Self-Driving (Supervised) suite. The other focuses on more than 80 cases where the software allegedly broke basic traffic laws, like running red lights. The engineering analysis phase means ODI is digging deeper, and a recall could be on the table if the agency finds systemic safety flaws.

Speculation: If ODI forces a recall, Tesla may have to halt software rollouts or even pull features from existing vehicles. That would be a major blow for drivers banking on hands-free tech, and for Elon Musk’s robotaxi dreams.

The bottom line

  • ODI’s probe into Tesla’s Full Self-Driving (Supervised) is now at its highest level.
  • Regulators are frustrated by missing data and unclear software fixes.
  • A recall is possible if safety flaws are confirmed, with real impact for Tesla owners and investors.