
Tesla Autopilot Has Critical Safety Gap, Federal Investigation Links Feature To Hundreds Of Crashes

April 26, 2024

NHTSA closes a three-year investigation into Tesla’s Autopilot feature after the company announced a recall in December. The agency states the system has a critical safety gap that contributed to hundreds of incidents and a growing tally of fatalities.

Tesla Model 3 Florida crash Aug 2021

Tesla is facing new scrutiny after the National Highway Traffic Safety Administration closed a three-year investigation into Autopilot.

The National Highway Traffic Safety Administration (NHTSA) announced Friday that it has closed a three-year investigation into Tesla’s Autopilot feature, concluding that the system has a “critical safety gap” that played a key role in hundreds of crashes.

In addition, the agency announced a new probe into whether a voluntary recall that the company issued back in December was sufficient to address the problems that emerged with Autopilot.


NHTSA investigation confirms critical safety gap 

Tesla’s Autopilot technology has been under close scrutiny for several years.

In its report, the agency revealed that Autopilot has a “critical safety gap” in its fundamental design that “led to foreseeable misuse and avoidable crashes.” The report goes on to state that the system did not “sufficiently ensure driver attention and appropriate use.”

A quick search on Google appears to confirm this portion of the NHTSA’s report, with multiple videos online showing Tesla owners doing a variety of unsafe things behind the wheel while Autopilot is active (including explicit acts), and the system appearing to keep driving even when the driver is visibly distracted.

This safety gap has led to a growing number of incidents, with the NHTSA confirming that Autopilot contributed to 467 collisions, 13 of which resulted in fatalities, along with many serious injuries.

Agency confirms new probe into Tesla recall 

Fatal Tesla Autopilot Crash in Florida

Autopilot has been linked to over 460 crashes and a growing number of fatalities

In addition to concluding its long-running investigation, the NHTSA also revealed that it is launching a new probe into the effectiveness of the voluntary December recall that was supposed to address many of these concerns. The recall came in the form of a software update, released to roughly 2 million affected vehicles, that was supposed to improve Autopilot’s driver monitoring software.

The agency indicated that these updates were probably not effective enough, with reports of Autopilot-related crashes still being documented even after the update was sent out to owners. The findings and the new probe are the latest in a long list of reports from regulators and watchdog groups that have all questioned the safety of Tesla’s Autopilot system.

Tesla, for its part, still markets Autopilot as a key differentiator from rivals and did not issue a response to the NHTSA’s actions.


Political leaders sound off, social media sites decline comment


Unlike Tesla, several political figures were quick to issue statements after the report was released, with Sens. Edward J. Markey, D-Mass., and Richard Blumenthal, D-Conn., issuing a joint statement calling for Tesla to restrict Autopilot “to the roads that it was designed for.” The two senators also called on Tesla to take all necessary actions to prevent these vehicles from endangering lives.

Prior to the publication of this story, we also reached out to several platforms that host videos of unsafe Autopilot activities to ask how they would discourage people from posting this type of content. Google, Facebook, and TikTok all declined to comment, which is concerning since some of the unsafe Autopilot stunts posted on social media come from people trying to replicate influencers and other social media celebrities who post similar content on their channels. This trend casts a larger spotlight on Autopilot’s problems and could lead to more crashes and fatalities as a result.


