Federal Safety Officials Investigate Tesla's Full Self-Driving Software Following Fatal Crashes
ICARO Media Group
Federal auto safety regulators have opened a new investigation into Tesla's Full Self-Driving (FSD) software, following reports of four accidents involving the system. One of these crashes killed a 71-year-old pedestrian in Arizona in November 2023. The inquiry will evaluate whether the driver-assistance system is safe in low-visibility conditions.
The National Highway Traffic Safety Administration (NHTSA) has long scrutinized Tesla's driver-assistance technologies, which have been implicated in numerous accidents and fatalities. The new probe will specifically examine how the FSD software performs in challenging visibility scenarios. The findings could test Tesla CEO Elon Musk's assertion that cameras alone are sufficient for navigation, without additional sensors such as radar or lidar.
This time, the NHTSA's scrutiny extends to Tesla's more advanced FSD system, which is marketed as capable of navigating vehicles with minimal human intervention. The agency said the investigation would assess the system's ability to detect when it cannot function adequately and to disengage, as well as how effectively it mitigates risk in those situations.
The fatal Arizona accident occurred on Interstate 17, about an hour and a half north of Phoenix. According to NHTSA records and the Arizona Department of Public Safety, a Tesla Model Y struck a pedestrian who was directing traffic after an earlier collision between two other vehicles. Poor visibility on the roadway was cited as a factor, although the accounts do not make clear how visibility was compromised before the incident.
Tesla's FSD system is available on about 2.4 million vehicles across the United States, although not all owners have purchased the software package. As of the first quarter of this year, approximately half of Tesla owners were using the FSD software, a figure that continues to grow. The FSD package costs $8,000, or $99 a month by subscription.
Both Autopilot and Full Self-Driving require drivers to remain vigilant and ready to intervene, despite their advanced capabilities. Autopilot, a standard feature in most Tesla models, offers sophisticated cruise control and lane-keeping on well-marked roads, while FSD extends these capabilities to navigating city streets, changing lanes, and handling stop lights.
The four reported incidents all involved Tesla vehicles operating with the FSD system engaged in poor visibility caused by glare, fog, or dust. Two of the crashes resulted in injuries. The investigation covers Model S and Model X vehicles from 2016 through 2024, Model 3 from 2017 through 2024, Model Y from 2020 through 2024, and Cybertrucks from 2023 and 2024.
Matthew Wansley, an expert in emerging automotive technologies at Yeshiva University's Cardozo School of Law, noted that Tesla's reliance on cameras while sidelining other sensors has always been atypical. He said sensors such as radar and lidar are beneficial in poor visibility, a view rejected by Elon Musk, who has famously dismissed lidar as unnecessary.
NHTSA's authority mainly lies in investigating safety problems and ordering necessary fixes post-sale, rather than pre-sale technology reviews. The agency has collected data on the Tesla crashes under a 2021 mandate requiring automakers to report incidents involving advanced driver-assistance technologies.
In a March incident in Virginia, a Tesla Model 3 rear-ended a Ford Fusion on Interstate 81; the Ford was traveling slowly because of mechanical problems. The Tesla driver sustained minor injuries and was cited for following too closely.
Tesla has previously issued multiple recalls for its FSD software. A 2023 update, connected to an earlier federal investigation, aimed to correct instances of vehicles speeding and behaving erratically at intersections. Tesla's Autopilot has also faced extensive scrutiny, leading to a recall of more than 2 million vehicles in December. Despite these efforts, further investigations have proved necessary, as demonstrated by a fatal crash involving Autopilot in April 2023.