Tesla's Autopilot Under Scrutiny as US Regulators Uncover 13 Fatal Crashes

ICARO Media Group
26/04/2024 20h58

In a significant development, the National Highway Traffic Safety Administration (NHTSA) revealed on Friday that its investigation into Tesla's Autopilot had uncovered at least 13 fatal crashes in which the feature was involved. The NHTSA also found discrepancies between Tesla's claims about Autopilot and the reality of its performance.

During the nearly three-year investigation, launched in August 2021, the NHTSA identified 13 Tesla crashes resulting in one or more deaths, as well as numerous incidents causing serious injuries. The agency noted that these accidents were attributable in part to "foreseeable driver misuse of the system." The investigation also found that Tesla's weak driver engagement system was not appropriate for Autopilot's permissive operating capabilities, creating a critical safety gap.

Of particular concern to the NHTSA was that the Autopilot name may lead drivers to believe the automation has greater capabilities than it actually does, encouraging them to over-rely on the technology.

In response to these findings, Tesla initiated its largest-ever recall in December 2023, encompassing approximately 2.03 million vehicles in the United States. The recall aimed to increase drivers' attentiveness while using Tesla's advanced driver-assistance system. However, regulators were not convinced of the effectiveness of Tesla's efforts and subsequently opened a second investigation to assess the adequacy of the recall's new Autopilot safeguards.

The second NHTSA investigation was prompted by crashes that occurred after the recalled vehicles had received the software update, as well as by the agency's preliminary test results. It covers Tesla Model Y, X, S, and 3 vehicles, along with the Cybertruck, manufactured between 2012 and 2024 and equipped with Autopilot.

While Tesla has issued software updates to address the concerns raised by the NHTSA, these updates have not been included in the recall or officially designated as a remedy for the identified safety risks. The NHTSA also flagged Tesla's statement that a portion of the remedy is optional, allowing drivers to reverse certain safeguards, which raises further concerns.

Consumer Reports, a non-profit organization evaluating products and services, shared similar apprehensions about Tesla's recall, stating that it failed to adequately address the safety concerns raised by the NHTSA. The organization urged the agency to demand stronger actions from the automaker and criticized the recall for only addressing minor inconveniences instead of resolving the underlying problems.

Tesla's Autopilot is designed to enable vehicles to steer, accelerate, and brake automatically within a lane. Enhanced Autopilot adds assistance with changing lanes on highways, but neither system makes the vehicle fully autonomous. One component of Autopilot, known as Autosteer, maintains a set speed or following distance and works to keep the vehicle within its driving lane.

The NHTSA's investigation has underscored the need for stronger measures to keep drivers engaged when using Autopilot. Previous investigations by the agency have examined more than 40 crashes in which Tesla vehicles using driver-assist systems such as Autopilot were implicated. To date, 23 crash deaths have been reported in those cases.

As part of the recall, Tesla said it would enhance visual alerts, deactivate Autosteer if drivers fail to respond to inattentiveness warnings, and add checks when engaging Autosteer. The company also said it would impose a one-week suspension of Autopilot use in cases of significant improper usage.

Notably, in October 2022 Tesla disclosed that it had received subpoenas from the US Department of Justice related to its Full Self-Driving (FSD) and Autopilot features. In February 2023, Tesla recalled 362,000 vehicles in the United States to update its FSD Beta software after the NHTSA raised concerns that the vehicles did not adequately adhere to traffic safety laws, potentially leading to crashes.

Tesla has not yet responded to requests for comment on this latest development. The ongoing investigations and concerns raised by regulators emphasize the need for continued scrutiny and improvements in the safety performance of Tesla's Autopilot system.

The views expressed in this article do not reflect the opinion of ICARO, or any of its affiliates.
