The US National Highway Traffic Safety Administration (NHTSA) has started an investigation into Tesla’s Full Self-Driving system in low visibility conditions.
The NHTSA investigation was triggered by 4 separate Tesla crashes. These incidents happened while the cars were being driven by the Tesla Autopilot/FSD system in low-visibility conditions.
Tesla FSD is still an advanced driver assistance system (ADAS), which the automaker calls FSD Supervised, meaning the person in the driver’s seat is required to remain attentive at all times.
However, according to the NHTSA document on the Tesla FSD probe (PDF below), these 4 crashes were either reported under the Standing General Order (SGO) or identified through media reports.
The Office of Defects Investigation (ODI) has identified four Standing General Order (SGO) reports in which a Tesla vehicle experienced a crash after entering an area of reduced roadway visibility conditions with FSD-Beta or FSD-Supervised (collectively, FSD) engaged. In these crashes, the reduced roadway visibility arose from conditions such as sun glare, fog, or airborne dust. In one of the crashes, the Tesla vehicle fatally struck a pedestrian. One additional crash in these conditions involved a reported injury. The four SGO crash reports are listed at the end of this summary by SGO number.
NHTSA starts the Tesla FSD probe for vehicle crashes in low roadway visibility conditions.

The NHTSA SGO requires vehicle manufacturers to report crash incidents if the involved vehicles are equipped with automated driving systems or SAE Level 2 advanced driver assistance systems.
As we can see in the above screenshot from the NHTSA report, more than 2.4 million Tesla vehicles are covered by this probe. Out of these 2.4 million Tesla cars manufactured between 2016 and 2024, NHTSA found only 4 FSD-related crashes that triggered the investigation.
Out of these 4 crash incidents, 1 resulted in an injury and 1 was a fatal accident. This statistic needs to be put into perspective by comparing human-driven vs. Tesla FSD miles per fatality.
According to a 2022 report by the Insurance Institute for Highway Safety (IIHS), there were 1.33 deaths per 100 million miles driven by humans in the United States. The report states:
There were 42,514 deaths from motor vehicle crashes in the United States in 2022. This corresponds to 12.8 deaths per 100,000 people and 1.33 deaths per 100 million miles traveled.
In its Q2 2024 company report, Tesla (TSLA) published the number of FSD miles driven through June 2024. Since the launch of FSD v12 last year, the number of miles driven on FSD has grown exponentially.
From June 2021 to June 2024, Tesla Autopilot/FSD, including v12, drove more than 1.6 billion miles. Extrapolating the IIHS fatality rate to 1.6 billion miles, human drivers would be expected to cause about 21.28 fatalities over the same distance (1.33 multiplied by 16, since 1.6 billion miles is 16 times 100 million).
Tesla FSD, on the other hand, is being probed for 4 incidents with only 1 fatality. NHTSA, however, did not specify the time range over which these incidents occurred. Even if we assume all of them happened in the last year alone, the crash occurrence per mile driven is far lower than that of human drivers (see graph below).

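To make the extrapolation above concrete, here is a minimal back-of-envelope sketch in Python using only the figures quoted in this article; the constant names and the expected_human_fatalities helper are purely illustrative.

```python
# Back-of-envelope comparison of human vs. FSD fatality counts,
# using only the figures quoted in this article (illustrative, not official).

HUMAN_DEATHS_PER_100M_MILES = 1.33   # IIHS figure for 2022
FSD_MILES = 1.6e9                    # FSD miles reported through June 2024
FSD_FATALITIES = 1                   # fatal crash cited in the NHTSA probe

def expected_human_fatalities(miles: float) -> float:
    """Fatalities expected if the same distance were driven by humans."""
    return HUMAN_DEATHS_PER_100M_MILES * (miles / 100e6)

expected = expected_human_fatalities(FSD_MILES)  # 1.33 * 16 = 21.28
print(f"Expected human-driver fatalities over 1.6 billion miles: {expected:.2f}")
print(f"FSD fatalities cited in the probe over the same distance: {FSD_FATALITIES}")
```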
The issue is not the NHTSA investigation itself but how mainstream media (MSM) outlets are covering it. MSM is portraying this investigation as the end of Tesla as a company and the complete failure of its Full Self-Driving system, spreading fear, uncertainty, and doubt (FUD).
But the reality is the opposite: the numbers show that FSD is steadily becoming safer than human driving, as Elon Musk has been predicting for years. “Supervised FSD is vastly safer than human driving,” he wrote on X (Twitter) last year.
In the last few months, the FSD user base has grown even more in the United States and Canada, and internal and external testing has increased exponentially. So, by October, it can be safely assumed that FSD miles driven have come close to the 2 billion mile mark, if they haven’t crossed this milestone already. Tesla (TSLA) will most probably share the Q3 2024 FSD miles in tomorrow’s earnings call and financial update.
“It’s amazing by most standards, but we are aiming for 1000% safer than the average human driver,” Musk said of Tesla FSD’s goal in a tweet years ago.
Tesla will most probably be able to correct Autopilot/FSD Vision’s handling of low-visibility conditions with one or more OTA software updates. The automaker improved Autopilot camera visibility with a software update last year, and in 2021 it fixed a backup camera darkness issue with an OTA update.
It has only been a few days since this investigation started, so we need to wait for its conclusion. Stay tuned as we bring you future updates on this matter.
Update: Tesla just confirmed that FSD Supervised has crossed the 2 billion mile mark, and half of these miles were driven on V12.
This means that if NHTSA considered only V12 for the crash safety data, there has been only 1 fatality in roughly 1 billion FSD miles driven, which is roughly 10X lower than the human-driver fatality rate.
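The same back-of-envelope method can be applied to the updated V12 figures; this is just a rough sketch assuming 1 fatality over roughly 1 billion V12 miles, as stated above.

```python
# Rough check of the "roughly 10X" claim using the updated V12 figures
# quoted above (illustrative numbers only).

HUMAN_DEATHS_PER_100M_MILES = 1.33      # IIHS figure for 2022
V12_MILES = 1e9                         # ~half of the 2 billion FSD miles
V12_FATALITIES = 1                      # fatal crash cited in the probe

human_rate_per_billion = HUMAN_DEATHS_PER_100M_MILES * 10    # 13.3 per 1B miles
fsd_rate_per_billion = V12_FATALITIES / (V12_MILES / 1e9)    # 1.0 per 1B miles

print(f"Human drivers: {human_rate_per_billion:.1f} fatalities per billion miles")
print(f"FSD V12:       {fsd_rate_per_billion:.1f} fatality per billion miles")
print(f"Ratio:         ~{human_rate_per_billion / fsd_rate_per_billion:.0f}x")
```

With these numbers the ratio works out to about 13x, consistent with the roughly 10X figure mentioned above.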