
Tesla FSD Flunks School Bus Test: Autonomous Driving Safety Under Scrutiny
The promise of fully autonomous vehicles has always been tantalizing, but a recent demonstration of Tesla's Full Self-Driving (FSD) software has renewed concerns about whether the technology is ready for widespread deployment. While Elon Musk touts the imminent arrival of a Tesla capable of driving itself from the factory to a customer's home, the results from Austin, Texas, paint a less rosy picture.
In a test conducted by The Dawn Project, a Tesla Model Y running FSD software repeatedly failed to stop for a school bus with its lights flashing and stop sign extended. More alarming still, the vehicle struck child-sized mannequins placed in its path in all eight test runs. This unsettling display raises serious questions about the reliability of Tesla's FSD system in critical safety scenarios.
It's worth noting that Tesla currently markets FSD as a "supervised" system that requires a fully attentive driver ready to intervene; the company explicitly warns that failure to follow these instructions could lead to "damage, serious injury or death." This raises an obvious question: if the system requires constant human oversight, is it truly "self-driving"?
The Dawn Project, led by Dan O'Dowd, who also heads a company that develops competing automated driving software, has long been a vocal critic of Tesla's FSD, even running advertising campaigns that highlight the system's failure to yield to school buses. The Austin demonstration adds fuel to that ongoing debate, as does an April 2024 crash in Washington state in which a Tesla Model S operating on FSD struck and killed a motorcyclist.
Meanwhile, Tesla's plans for the Cybercab, its fully autonomous robotaxi, have reportedly been delayed. Musk initially suggested a launch date of June 22nd but later tempered expectations, citing the company's "super paranoid" approach to safety, while also floating a potential milestone of a Tesla driving itself from the factory to a customer's home by June 28th. Given the recent test results and past incidents, those pronouncements warrant a healthy dose of skepticism.
The core issue here isn't simply about Tesla's technology. It's about the broader implications of deploying autonomous systems in complex, real-world environments. While the potential benefits of self-driving cars – reduced accidents, increased mobility for the elderly and disabled, and improved traffic flow – are undeniable, these advantages can only be realized if the technology is demonstrably safe and reliable. The Austin demonstration serves as a stark reminder that the path to full autonomy is fraught with challenges, and that rigorous testing and validation are essential before these systems are unleashed on public roads.

Source: Engadget