A man conducted a dangerous experiment, painting a wall to look like a road in order to push Tesla’s self-driving technology to its limits.
In a recent video, a YouTuber put Tesla’s self-driving capabilities to a surprising test.
Mark Rober, a former NASA engineer, wanted to see if Tesla’s Autopilot could tell the difference between a real road and an optical illusion.
His test involved a wall painted to look like a road, and the results were shocking.
Man dangerously tests Tesla’s self-driving by painting a wall as a road to see if it detects the illusion
Rober’s goal was simple: push Tesla’s self-driving technology to its limits.
He used a lightweight styrofoam wall that looked like a road to see if the car would recognize it as an obstacle.

The experiment was designed to understand how Tesla’s optical camera system works compared to other self-driving technologies.
Using a setup similar to a Wile E. Coyote trap, Rober filmed the test to capture the car’s reactions.
He wanted to know if Tesla’s cameras, which rely on image processing, could identify a fake road.
Tesla failed to recognize the wall during testing
During the test, a Tesla driving in Autopilot mode went straight through the wall without hitting the brakes.
The high-speed footage showed how easily the car was tricked by the optical illusion.
“I can definitely say for the first time in the history of the world, Tesla’s optical camera system would absolutely smash through a fake wall without even a slight tap on the brakes,” Rober said.
This result raised questions about the effectiveness of Tesla’s camera-only approach.
The car’s system was designed to respond to obstacles, but in this case, it failed to recognize the fake wall.

Comparing self-driving technologies reveals significant differences
Rober compared Tesla’s system with other self-driving technologies that use more advanced tools like LiDAR and radar.
These technologies are used by many of Tesla’s competitors and can scan the environment in three dimensions.
In the same test, a vehicle equipped with LiDAR easily recognized the wall as an obstacle and stopped in time.
This comparison highlighted the gap between the two sensing approaches.
While Tesla relies solely on cameras, LiDAR systems can provide a more detailed understanding of the surroundings.
Additional safety tests
Rober conducted multiple tests at a speed of 40 mph. In one test, both the Tesla and a LiDAR-equipped car approached a dummy.
Although the Tesla detected the dummy, it did not brake quickly enough and collided with it.

Rober explained that Tesla’s automatic emergency braking system only activates when it is sure there is a problem.
This design aims to reduce false positives, but it can lead to dangerous situations.
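The tradeoff Rober describes can be pictured as a simple confidence threshold: set it high and the car rarely brakes for phantom obstacles, but it may also react too late to real ones. The sketch below is purely illustrative; the threshold value and the scores are hypothetical and do not reflect Tesla’s actual braking logic.

```python
# Illustrative sketch of a confidence-gated emergency-braking policy.
# The threshold and confidence scores here are hypothetical examples,
# not Tesla's real algorithm.

def should_brake(obstacle_confidence: float, threshold: float = 0.9) -> bool:
    """Brake only when the system is confident an obstacle is real."""
    return obstacle_confidence >= threshold

# A phantom reading (e.g. a shadow) scores low, so no false-positive brake:
assert should_brake(0.3) is False

# But a real obstacle whose confidence never crosses the threshold
# (e.g. a wall painted to look like open road) is also missed:
assert should_brake(0.7) is False

# Lowering the threshold catches it, at the cost of more false alarms:
assert should_brake(0.7, threshold=0.5) is True
```

The point of the sketch is that any fixed threshold trades missed detections against false alarms, which is exactly the tension Rober’s tests exposed.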
In further tests, Rober evaluated how both cars handled challenging weather conditions, such as fog and rain.
The LiDAR-equipped car performed well, but the Tesla struggled and often made contact with the dummy.
These results raised serious concerns about the Tesla system’s safety in real-world conditions.
Community reactions and debates
The video of Rober’s experiment sparked intense discussions among Tesla fans and the general public.
Some viewers suggested that he should have used Tesla’s more advanced Full Self-Driving (FSD) software for a fairer test.
Despite this, the experiment brought attention to significant questions about the safety and reliability of Tesla’s camera-only system.
Many users in the comments expressed their thoughts, debating the advantages and disadvantages of different self-driving technologies.
While some defended Tesla, others pointed out the risks associated with relying solely on cameras.

One user said: “To be fair half of yall can’t drive with your own eyes so you ain’t no better than Tesla.”
A second user added: “Saw the video of this. The Tesla failed on the painted wall, in dense fog and hard rain.”
A third user commented: “You pay extra for wall painted like a road detection capability. I gotta think that is an extremely rare occurrence.”
A fourth user wrote: “Used the old autopilot tech not FSD and he disengages the autopilot before he even hits the wall.”
Another user said: “He was already called out because he was only using the autopilot. He wasn’t using the car in full self-driving mode and claimed he was.”
Someone added: “Except it wasn’t FSD. It was autopilot.”