Tesla is once again in the crosshairs of the mainstream media and press after a 2019 Model S was involved in a crash while allegedly running on Autopilot Full Self-Driving (FSD). Authorities suspect that there was no driver in the car at the time of the incident.
Unfortunately, two lives were lost in this accident. While the actual cause is still under investigation, the media is trying to pin the entire blame on Tesla and its Autopilot FSD software.
Currently, a limited number of Tesla owners have the FSD Beta software on their cars, and a big chunk of these drivers are Tesla employees testing the system.
Tesla has clearly outlined the precautions and safety measures for using its self-driving beta software: essentially, the driver needs to stay alert and be ready to take over whenever Autopilot cannot handle a situation.
Tesla's FSD Beta software is not supposed to work at all if there is no occupant in the driver's seat. However, mainstream media reports, such as one from abc13, suggest that investigators are confident no one was in the driver's seat when the crash happened.
Now, a Tesla owner and YouTuber who happens to have the FSD Beta software on his Model 3 has demonstrated what happens when you unbuckle the seat belt while driving on Autopilot FSD. Let's watch first:
As the video above shows, Tesla Autopilot disengages automatically as soon as the driver unlatches the seatbelt. Not only does it disengage, the FSD software also brings the car to a stop and even pulls it over for added safety.
If someone has intentionally bypassed these checks by cheating the system somehow, it is unlikely that Tesla will be held responsible for the outcomes.
Commenting on the recently released Tesla Autopilot Q1 2021 safety report, CEO Elon Musk tweeted: "Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle."
In the 1st quarter, we registered one accident for every 4.19 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.05 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 978 thousand miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.
— Tesla safety report
This means that driving with Autopilot engaged carries a nearly 9X lower chance of being involved in an accident than the US average, and just the basic active safety features still deliver a roughly 4X lower chance.
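The ratios above follow directly from the miles-per-accident figures in Tesla's quoted report. A quick sanity check of the arithmetic (the constants are taken from the report text above, the variable names are our own):

```python
# Miles driven per accident, Tesla Q1 2021 safety report
MILES_AUTOPILOT = 4_190_000      # Autopilot engaged
MILES_ACTIVE_SAFETY = 2_050_000  # no Autopilot, active safety features on
MILES_NO_SAFETY = 978_000        # no Autopilot, no active safety features
MILES_US_AVERAGE = 484_000       # NHTSA: one crash every 484,000 miles

# How many times farther a Tesla goes between accidents than the US average
print(f"Autopilot vs US average:      {MILES_AUTOPILOT / MILES_US_AVERAGE:.1f}x")    # ~8.7x
print(f"Active safety vs US average:  {MILES_ACTIVE_SAFETY / MILES_US_AVERAGE:.1f}x")  # ~4.2x
print(f"No active safety vs average:  {MILES_NO_SAFETY / MILES_US_AVERAGE:.1f}x")    # ~2.0x
```

So "approaching 10 times lower" rounds up from about 8.7x, and even the oldest cars without active safety still come out roughly 2x ahead of the national average.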
Even if a Tesla vehicle isn't loaded with the $10k Full Self-Driving package, the AI software keeps working in shadow mode; if a crash appears imminent, it can kick in to avoid an obvious accident.
For newer cars with Autopilot Hardware 2.5 and above (2018 Model 3 onwards), Tesla's active safety features, built on basic Autopilot, come standard.
The cars Tesla refers to as being without active safety in the report above are actually older Tesla models without the new Autopilot hardware, usually called AP1 cars. Even those have a roughly 2X lower probability of an accident compared to the average US stats.
Related: Tesla owner leaves the car on Autopilot while filming from the passenger seat