Tesla On FSD Suddenly Swerves And Crashes Into A Tree, Claims Driver

- A new Tesla Model 3 crash reportedly happened while running on Full Self-Driving (Supervised).
- Video from the car shows it driving across the oncoming traffic lane, into grass, and ultimately a tree.
- If this video is everything it purports to be, Tesla will need to sort out exactly what happened ASAP.
Autonomous driving may be the future, but the present still has a lot of explaining to do. Especially when cars with so-called “Full Self-Driving” capabilities start careening off the road for no obvious reason.
That said, it’s rare to see anything like what appears in a newly released set of videos involving a Tesla. According to the title, they show a crash that happened while the car was running Tesla’s Full Self-Driving (Supervised) system. Worse still, the crash seems to happen without rhyme or reason, in broad daylight, with no traffic, on a straight road.
Tesla famously relies on camera-based hardware and software to run its semi-autonomous Autopilot and Full Self-Driving (Supervised) systems. In theory, that makes sense, since we humans drive almost entirely by sight. In practice, though, the approach raises some major concerns, and this video highlights them. We’ll circle back to that.
The Incident: Straight Road, Sharp Left Turn
A YouTube channel recently uploaded four videos showing each side of a car during a crash. They say the car is a Model 3 running FSD 13.2.8, which is almost the latest available version. Tesla released a newer build on May 11, but this crash happened back on February 26, so the software was up to date at the time, given that information.
What the videos show is the most shocking part of this entire situation. Across three of the four clips, we see the car moving for 45 seconds. In all of them, everything appears totally normal for the first 31 seconds as the car trundles down a two-lane road. Then, just as a car passes going the opposite way, all hell seems to break loose.
The car turns hard to the left, crosses the opposing traffic lane, leaves the road, and hits a tree before rolling over. Less than three seconds pass from the moment it begins to turn to the moment it hits the tree. While that’s tough to swallow, it’s the conditions that really make this a bad deal for Tesla.
The road was perfectly straight. The crash appears to have happened early or late in the day, as the shadows cast on the ground are long. Even so, the sun is bright and seemingly unobstructed by clouds, so there’s no lack of light in the scene. Finally, there’s no complex traffic situation to untangle: no confusing lane markings, other cars, or road signs.
Still, for whatever reason, it appears as though this car, allegedly on FSD, simply decided it needed to leave the roadway and did exactly that. Adding even more confusion are the many videos of YouTubers testing FSD against inanimate objects in the road. In almost every case, the technology responds by slowing down, or even stopping, to avoid an obstacle. Very rarely does it attempt this sort of hard steering input at speed.
The Lidar Elephant in the Room
And this brings us back to vision-based autonomous driving systems. Again, we humans use vision to determine how to control our cars. Tesla is trying to do that too, but it has caught flak, and I suspect it’s about to catch far more, over its choice to skip lidar and radar tech.
While vision can work, and obviously does for most people on most days, lidar and radar can see through bad weather conditions like fog or haze far more easily. At minimum, they could serve as a redundancy, confirming what a vision-based system thinks it sees. Nevertheless, Tesla ditched radar years ago, and its CEO, Elon Musk, appears committed to never bringing it back.
Reports From The Driver
According to the person who posted the videos on Reddit, he was going around 55 mph when the crash happened. He says of the experience, “I loved the FSD until this happened. I was a full believer in autonomous vehicles until this happened to me. Lesson learned.” Thankfully, his only injuries were a cut on his chin, some lower-back discomfort, and “emotional damage,” as he calls it.
It’s worth pointing out that there are many unknowns here. While there appears to be no reason to suspect the videos or their description are inaccurate, there could be more to the story than we’re being told. If nothing more comes to light, though, Tesla is likely in a lot of hot water over this. The owner has requested all of the data relating to the crash, so hopefully more of it sees the light of day.
Previous crashes involving the software typically offered Tesla’s defenders something to cling to. Based on everything visible in the four videos here, it appears as though FSD has just made its most blatant mistake in the public sphere.
If this is possible with the hardware and software set to run Tesla’s planned Robotaxi service, the company might have to be even more careful than it’s already planning to be. When asked if he’d ever buy another Tesla, the owner’s reply was damning: “I want another but would NEVER use FSD again.” Yeah, I think we can all understand why.
