A car is driving along a country road; the driver is relaxed. Suddenly the steering wheel turns on its own and the car plunges into the ditch. The driver? At that moment, little more than a passenger. That is exactly what happened.
By Dieter Petereit
24.05.2025, 20:15
An incident in the US state of Alabama, in which a Tesla Model 3 with "Full Self-Driving" (FSD) activated rolled over, is currently causing a stir. As Electrek reports, the vehicle left the road for no apparent reason, collided with a fence and landed on its roof. The driver, "Wally", escaped with minor injuries.
"I was on my way to work and had Full Self-Driving on," Electrek quotes him as saying. "The steering wheel suddenly started to turn quickly, the car went into the ditch, hit a tree and overturned. I didn't have time to react."
A worrying theory: shadows as the cause of the accident?
The Tesla was equipped with the latest Hardware 4 and a current FSD software version (v13.2.8). An official cause of the accident is still pending, but a controversial theory is being discussed in the community. In a widely discussed thread on Reddit, users speculate that the system may have misinterpreted the shadows of trees on the road as an obstacle and initiated an abrupt evasive maneuver.
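To make that speculation tangible, here is a deliberately naive sketch in Python. It is not Tesla's actual pipeline (FSD runs proprietary neural networks whose internals are not public); the function name and thresholds are invented for illustration. It merely shows why a vision heuristic that treats large dark regions on the road as obstacles would be fooled by a hard-edged tree shadow.

```python
import numpy as np

# Purely hypothetical illustration -- NOT Tesla's actual FSD system.
# A naive rule that flags large dark regions in the road area as
# obstacles reacts to a hard tree shadow exactly as it would to a
# real obstruction, because both simply read as "dark pixels".

def looks_like_obstacle(gray_frame: np.ndarray,
                        darkness_threshold: int = 60,
                        area_ratio: float = 0.15) -> bool:
    """Return True if a large dark blob covers the lower (road) half of the frame."""
    h, w = gray_frame.shape
    road_region = gray_frame[h // 2:, :]             # crude region of interest: lower half
    dark_pixels = road_region < darkness_threshold   # shadow and obstacle are indistinguishable here
    return dark_pixels.mean() > area_ratio           # large dark area -> "obstacle"

# A sunlit road with a hard-edged tree shadow cast across it:
frame = np.full((120, 160), 200, dtype=np.uint8)     # bright asphalt
frame[60:, 40:110] = 30                              # dark band from a tree shadow

print(looks_like_obstacle(frame))  # True -- the shadow alone triggers the flag
```

A real perception stack is vastly more sophisticated, but the Reddit thread's argument reduces to the same point: Tesla's camera-only approach must infer depth from images alone, so a high-contrast shadow can, in principle, be mistaken for a physical object.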
The connection to the hardware involved is an awkward one. Last year, we at t3n already reported on various problems and an increased error rate in Hardware 4. Whether there is a link to the current crash is unclear, but it raises further questions about the reliability of the platform.
Not an isolated case: Tesla's controversial Autopilot software
The crash is the latest in a long series of incidents that have raised doubts about the safety of Tesla's driver assistance systems. According to a report by Manager Magazin, Autopilot had been involved in over 700 accidents with 17 fatalities in the USA alone by mid-2023. The technology has also drawn criticism in Germany: the Federal Motor Transport Authority (KBA) in Flensburg has already launched investigations into the problem of so-called phantom braking.
Tesla, the electric car manufacturer from Austin, Texas, by contrast, regularly points in its official vehicle safety report to a significantly lower accident rate with Autopilot activated than without assistance. Critics, however, question the methodology of this data collection as well as the misleading term "Full Self-Driving", which suggests a level of autonomy the system does not actually have.
The current crash underlines the serious gap between marketing promise and technological reality. It shows that even the latest systems can still fail at everyday phenomena such as shadows on the road. Drivers are left with the bitter realization that they cannot fully rely on the technology, yet may not even get the chance to intervene at the crucial moment.
https://t3n.de/news/tesla-fsd-crash-schatten-ursache-1689510/