In a disturbing incident this March, a North Carolina high school student was severely injured after being struck by a Tesla Model Y. The vehicle, traveling at significant speed and operating under Tesla’s Autopilot system, failed to stop as the student disembarked from a school bus.
This case is not isolated. According to a recent federal investigation, it is part of a troubling pattern linked to Tesla’s advanced driver-assistance systems: Autopilot and its more sophisticated counterpart, Full Self-Driving (FSD).
Tesla: Investigative Findings Highlight Flaws and Driver Complacency
The National Highway Traffic Safety Administration (NHTSA) has documented a concerning trend through its examination of 956 Tesla-related crashes since January 2018. The agency discovered that drivers were often disengaged, relying too heavily on technology that is, as yet, not foolproof.
Despite the company’s claims that these systems are stepping stones to fully autonomous driving, NHTSA’s findings reveal that both Autopilot and FSD fail to keep drivers sufficiently connected to the task of driving.
“Drivers using Autopilot or the system’s more advanced sibling, Full Self-Driving, were not sufficiently engaged in the driving task,” the agency noted, emphasizing the lack of adequate mechanisms to ensure driver alertness.
A Closer Look at the Crashes
The statistics are alarming. Of the nearly one thousand incidents reviewed, some involved other vehicles striking a Tesla, but many resulted from Teslas striking other vehicles or obstacles.
The human toll includes 29 fatalities and numerous injuries. Particularly notable are 211 crashes in which a Tesla struck another object head-on, resulting in 14 deaths.
NHTSA’s report also pointed out that many Tesla drivers did not take evasive action quickly enough, even when the obstacle was visible for a considerable time before the crash. This suggests a dangerous overreliance on automation technology.
Comparisons with Industry Standards
The NHTSA report criticized Tesla for deviating significantly from industry norms. While other automakers employ terms like “assist” to denote their systems’ supportive role, Tesla’s choice of “Autopilot” might suggest to drivers that less attention is necessary, leading to hazardous complacency.
Legal and Safety Concerns Mount
Further complicating matters, both California’s attorney general and the state’s Department of Motor Vehicles are investigating the EV giant for potentially misleading marketing and branding related to its driver-assistance technologies.
These actions underscore growing legal and safety concerns surrounding the brand’s ambitious push toward autonomy.
Tesla responded to these concerns with a voluntary recall and an over-the-air update intended to enhance Autopilot’s safety features. However, skepticism remains about the adequacy of these measures, with NHTSA initiating another investigation into the recall’s effectiveness.
Musk’s Vision Contrasts with Regulatory Warnings
Amid these investigations, Elon Musk continues to advocate for the safety and future potential of autonomous Tesla vehicles. In a recent earnings call, Musk asserted that autonomous vehicles could significantly reduce accident rates compared to human drivers.
Yet NHTSA’s findings suggest a gap between this vision and the current reality of Tesla’s technology.
A Call for Caution and Transparency
As Tesla pushes forward with plans to introduce a fully autonomous robotaxi later this year, the contrast between Musk’s optimistic projections and the stark warnings from safety regulators could not be more pronounced.
For current and prospective Tesla drivers, the message is clear: Stay vigilant and informed about the capabilities and limitations of Autopilot and Full Self-Driving systems. As these technologies evolve, so too must our understanding and regulation to ensure that safety truly comes first.