
Deadly Tesla Crash Raises Questions About Vision-Based Self-Driving Systems

  • A 2023 fatal crash in Arizona is linked to Tesla’s Full Self-Driving software system.
  • The incident raises questions about Tesla’s vision-only autonomous driving strategy.
  • It coincides with Tesla’s Robotaxi push and sparks concerns over autonomous readiness.

When tech collides with the real world, the consequences are rarely theoretical. In late 2023, a tragic accident happened in Arizona. Of the 40,901 traffic fatalities that year, it was unique: it was the only one involving a pedestrian and a Tesla reportedly running in Full Self-Driving (Supervised) mode. Now, as Tesla begins its Robotaxi launch in Austin, the crash is raising questions about safety, both today and in the years ahead.

The accident happened in November of that year, when Johna Story, a 71-year-old grandmother, was pulled over on the interstate. She had stopped to help others who had been involved in an earlier accident. Video from the Tesla shows that the roadway leading up to the crash was obscured by direct sunlight on the horizon.


That said, the video of the crash obtained by Bloomberg does show warning signs that something was wrong. While the roadway itself is impossible to see, the car in the right lane slows down, other vehicles are parked on the right shoulder, and a bystander waves their hands at oncoming traffic to slow down.

Before he knew it, Tesla driver Karl Stock was veering left, then back toward the road before hitting a parked Toyota 4Runner and Story head-on. She passed away at the scene. “Sorry everything happened so fast,” Stock wrote in a witness statement for police. “There were cars stopped in front of me and by the time I saw them, I had no place to go to avoid them.”

Notably, Bloomberg claims that FSD was engaged at the time of the accident. “He [Stock] had engaged what the carmaker calls Full Self-Driving, or FSD,” the report claims. This isn’t substantiated by the police report. Neither the reporting officers nor Stock mentions FSD, Autopilot, or any sort of cruise control or autonomous system. That said, it’s possible that the publication gained access to the non-public NHTSA crash report and that more data is available there.

Vision Vs. Lidar & Radar

Ultimately, crashes like this highlight what seems like the most obvious concern with Tesla’s FSD. Vision-based systems perceive the road in much the same way humans do. That means that when humans struggle to see the roadway ahead, as with bright sunlight on the horizon or in smoke-filled or foggy conditions, vision-based systems can struggle too.

As mentioned, it remains unclear exactly when FSD was engaged and when it wasn’t. Even if the system disengaged in time for Stock to take over, though, it’s unclear how he could have seen what was coming in time to avoid the crash. In fact, this crash and others like it, albeit without additional fatalities, led the NHTSA to open an investigation into FSD that is still ongoing.

“A Tesla vehicle experienced a crash after entering an area of reduced roadway visibility conditions with FSD-Beta or FSD-Supervised (collectively, FSD) engaged. In these crashes, the reduced roadway visibility arose from conditions such as sun glare, fog, or airborne dust,” the investigation notice states.


On the flip side, systems that rely on radar or lidar can ‘see’ beyond fog, light glare, and smoke. They can pick up on obstacles that vision-based systems sometimes have real trouble with. In this case, a lidar-equipped system could potentially have alerted Stock to the stopped vehicles. That doesn’t make such systems perfect, though.

Cruise famously shut down after billions of dollars in investment because of crashes, and its cars all used radar and lidar and still failed. All of that said, it remains something of a mystery why Tesla and its CEO, Elon Musk, are so staunchly committed to vision-only systems. Only time will tell if that changes.
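
To make the distinction concrete, here is a minimal, purely illustrative sketch of the kind of sensor-fusion logic described above. It is not Tesla’s, Cruise’s, or any automaker’s actual code; the names (CameraDetection, LidarReturn, should_brake) and the thresholds are hypothetical, chosen only to show why a range-based sensor can still trigger a response when glare washes out the camera.

```python
# Purely illustrative sketch: a toy fusion rule, not any automaker's actual stack.
# All names (CameraDetection, LidarReturn, should_brake) and thresholds are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CameraDetection:
    confidence: float  # 0.0 (nothing seen) to 1.0 (certain there is an obstacle)


@dataclass
class LidarReturn:
    distance_m: float  # range to the nearest return in the lane ahead, in meters


def should_brake(camera: CameraDetection,
                 lidar: Optional[LidarReturn],
                 speed_mps: float,
                 min_gap_s: float = 2.0) -> bool:
    """Brake if either sensor suggests an obstacle closer than a safe following gap."""
    # A vision-only system depends entirely on camera confidence,
    # which can collapse in glare, fog, smoke, or dust.
    if camera.confidence > 0.8:
        return True
    # A lidar return is geometry rather than appearance: it still reports range
    # through glare, so a fused system can react even when the camera is blinded.
    if lidar is not None and lidar.distance_m < speed_mps * min_gap_s:
        return True
    return False


# Example: camera washed out by sun glare (low confidence), but lidar still
# reports a stopped vehicle 30 m ahead at roughly 29 m/s (about 65 mph).
print(should_brake(CameraDetection(confidence=0.1), LidarReturn(distance_m=30.0), speed_mps=29.0))
```

Run as a script, the example prints True: even with camera confidence knocked down to 0.1 by glare, the 30-meter lidar range at highway speed falls inside the safe-gap threshold, so the toy fusion rule brakes. A camera-only version of the same rule would do nothing.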

The Robot Elephant In The Room


We might learn sooner rather than later whether Tesla sticks with its vision-only approach. The automaker is already testing robotaxis and driverless cars, and is set to expand that effort this month in Austin, Texas. Musk has promised that the program will grow throughout the year and that Level 5 autonomous driving is coming soon.

Of course, Tesla has continually improved FSD over the years. It’s a dramatically more capable system than it was in 2023, but it still has some major issues. Just a few weeks ago, a Tesla, reportedly with FSD engaged, crashed on an open road with no obstacles, no visual cues, and no other apparent explanation. We’ve yet to confirm the details, but in the video, the car literally drove off the road and into a tree at around 55 mph. These are exactly the kinds of crashes Tesla’s Robotaxis cannot afford if the automaker ever wants them to go mainstream. For now, there’s little more to do than wait and see what happens.

Tesla’s Take

Here is the full, 36-page crash report of that Arizona incident, with both police and eye-witness reports and everything: https://t.co/wvfvgl8ET3 pic.twitter.com/3y5DSgDzpU

— Jaan of the EVwire.com ⚡ (@TheEVuniverse) June 4, 2025

The automaker is famous for its lack of a PR department [until it really wants to get a message out]. That said, it does sometimes speak about why it continues to push for Autopilot and FSD usage among its customers.

Two years ago, when it recalled over a million cars, it said, “We at Tesla believe that we have a moral obligation to continue improving our already best-in-class safety systems. At the same time, we also believe it is morally indefensible not to make these systems available to a wider set of consumers, given the incontrovertible data that shows it is saving lives and preventing injury.”

While Tesla is notoriously opaque about the safety data it gathers, it does claim that its cars are safer on average than human drivers. Since third parties don’t have full access to that data to validate it, it’s hard to simply accept those claims. Nevertheless, if they’re accurate, Tesla has a point. At the end of the day, nobody in this equation wants to risk lives. The question is, which route is the safest, not just in the future, but right now?

Lead image: Bloomberg/YouTube
