Last week, a prominent YouTuber who claims to be an ex-NASA and Apple engineer posted a video of a Tesla running Full Self-Driving (FSD) crashing through a wall painted to look like the road ahead.
We’ve all seen the Road Runner cartoons as children. In a typical episode, Wile E. Coyote paints a fake road onto a wall at a dangerous dead end with a drop beyond it. Road Runner passes through it safely, and Wile E. Coyote, in disbelief, crashes when he tries to run through the painted wall himself.
This was Mark Rober’s inspiration for testing Tesla FSD in his legacy Model Y electric SUV, which he compared against a vehicle equipped with lidar. Both vehicles went through six different self-driving safety tests; Tesla Vision AI passed only three, while the lidar-equipped SUV passed all of them.

Mark Rober’s Dubious FSD Testing
However, looking closely at the tests, especially the last one (the Wile E. Coyote painted wall), the Tesla community spotted some discrepancies. Surprisingly, Mark Rober did not use the Autopilot feature properly when performing these tests.
In the final attempt, as the camera showed the center display, Mark did not engage Tesla Autopilot (FSD) until the last moments, which is why the car did not brake for the fake wall. The visualization on the screen also showed no obstacle in front of the car.
There’s another issue with Mark’s tests. He never disclosed the version of the FSD software installed on the Model Y used in the video, nor the vehicle’s model year, which determines whether the car has the legacy Hardware 3 (HW3/AI3) or the newer HW4 (AI4) computer.
Either computer, AI3 or AI4, would likely have failed if the driver manually accelerated the vehicle to ~40 mph and engaged Autopilot only moments before hitting the wall. Following harsh criticism, Mark posted the following raw footage of the painted-wall test on X (Twitter).
I have some questions:
— Sawyer Merritt (@SawyerMerritt) March 17, 2025
1) Your video "Can You Fool a Self-Driving Car?" uses Luminar’s latest tech but not Tesla’s latest FSD software. Why?
2) Autopilot was turned on at 42 MPH in your YouTube video but you turned it on at 39/40 MPH in your clip above. Why? Multiple takes?
3)… pic.twitter.com/TE0BZozNL8
Re-Testing Tesla FSD for Painted-Wall Crash
In the first 7 days after its release, Mark Rober’s Tesla FSD painted-wall-crash video received around 17.5 million views. Since the tests were technically flawed, as explained above, a re-creation of the same scenario was needed.
Tesla YouTuber Kyle Paul re-created the scenario by placing a painted wall on the road. He first tested a legacy AI3 Model Y running FSD v12.5.4.2 (firmware 2025.2.6). The car did not see the wall until it was too late, and Kyle had to intervene and brake manually to stop the vehicle before it hit the painted wall.
The HW3 FSD version 12.5.4.2 is quite old by now. FSD v12.6.3 has earned good reviews from beta testers, so re-testing with that version might give better results.

However, a Cybertruck running FSD v13 passed the Wile E. Coyote Road Runner-style painted-wall test. Tesla Cybertrucks come with the latest AI4 computer.
The Cybertruck used in this test had FSD v13.2.8 installed on firmware version 2024.45.32.20. With the newer generation of hardware and software, it was expected to perform better than the older Model Y.
Interestingly, the Cybertruck with HW4 and FSD v13.2.8 did detect the fake painted wall and automatically applied the brakes in time to avoid a crash. No human intervention was needed in this test.
Verdict
Tesla (TSLA) is constantly improving its Full Self-Driving (FSD) software and hardware in pursuit of fully autonomous driving. The American tech automaker has the infrastructure (neural-net AI data centers such as Cortex) and the software innovation to pursue this goal.
Even though an AI3/HW3 Tesla failed this test, Tesla Vision AI can catch up and be trained to handle such situations (which are, honestly, far-fetched in real-world driving).
Tesla uses only cameras (front, side repeaters, B-pillars, rear, and now a front bumper camera) to form the vehicle’s vision. Although a teardown of the AI4/HW4 computer revealed that new vehicles have an HD radar installed, the automaker hasn’t yet started using it for real-world Autopilot/FSD driving.
The redundancy of the radar might come in handy in future versions of Tesla Full Self-Driving, helping Tesla vehicles make better decisions in scenarios like the one in this unusual test.