
Tesla Recalls 2 Million Vehicles Over Autopilot Safety Concerns

December 13, 2023

Tesla will recall 2 million EVs sold in the U.S. to make significant updates to its Autopilot system. As Headlight.News reported on Tuesday, the semi-autonomous technology has come under fire because, among other things, it can be used on roads for which it was not designed. That has been linked to numerous fatal crashes. But Tesla put a positive spin on things, with CEO Elon Musk claiming the recall is part of a “moral obligation” to ensure the safety of its vehicles.


Tesla CEO Elon Musk said recalling the 2 million vehicles was the company’s “moral obligation.”

Tesla will install new software safeguards on 2 million of its vehicles equipped with the Autopilot system. The largest recall in its history will cover virtually all of the EVs the Texas-based automaker has sold in the U.S.

As Headlight.News previously reported, Tesla has come under increasing fire for alleged problems with Autopilot, triggering a number of state and federal safety probes. The recall aims to address concerns that the semi-autonomous system can be activated even when a vehicle is not operating on roads for which it was designed. A report by the Washington Post last weekend linked that issue to “at least eight fatal or serious Tesla crashes.”

An ongoing debate

The recall was formally announced by the National Highway Traffic Safety Administration Wednesday morning.

The agency, which is the primary overseer of federal safety regulations, had itself been coming under increased pressure, and not just from independent safety advocates.


Teslas operating under Autopilot have been involved in a number of crashes involving fatalities and serious injuries.

In an October interview with the Post, Jennifer Homendy, the chair of the National Transportation Safety Board, directly called out NHTSA for failing to enact new regulations governing semi-autonomous driving software, Autopilot in particular, calling that “a real failure of the system.” Homendy said NHTSA needs to “stand up” on behalf of the public, adding that “safety does not seem to be the priority when it comes to Tesla.”

Leaving it up to the driver

Tesla isn’t the only automaker adding semi-autonomous features to its vehicles. General Motors, with Super Cruise, and Ford, with BlueCruise, currently offer features that go a step further, allowing full hands-free operation on specific roadways. But they use “geofencing” that won’t allow those technologies to operate except when on roads for which they are designed — most typically limited-access and divided highways. Both systems also turn off under bad weather conditions.

Tesla, by contrast, has long held to the position that “the driver determines the acceptable operating environment,” as it advised the NTSB following a fatal 2018 crash. Tesla long resisted any additional steps that could limit the use of Autopilot. That included anything designed to ensure drivers remain ready to take full control of a vehicle using Autopilot in an emergency. Such a feature has been part of both Super Cruise and BlueCruise from launch.

A Florida judge ruled Tesla officials, including CEO Elon Musk, likely knew there was a problem with Autopilot and failed to act.

As part of the new recall, Tesla acknowledged that the way Autopilot currently functions “may not be sufficient to prevent driver misuse,” potentially increasing the risk of an otherwise preventable crash.

A “moral obligation”

NHTSA officials, in a statement announcing the recall, said they’d found problems with “the prominence and scope” of controls and warnings designed to prevent misuse of Autopilot, Reuters reported.

In reversing course, Tesla CEO Elon Musk put a positive spin on the recall. In a statement on X, formerly known as Twitter, he said the automaker had a “moral obligation” to move forward with the development of autonomous technologies.

The South African-born executive repeated his prior assertion that vehicles using Autopilot are 10 times safer than those running without the technology.

NHTSA continues its probe, but sees an upside


Tesla’s Autopilot has been linked to more than three dozen crashes since its creation.

“Safety metrics are emphatically stronger when Autopilot is engaged than when not engaged,” Musk said in his tweet.

For its part, NHTSA cautioned that its ongoing investigation of the technology “remains open as we monitor the efficacy of Tesla’s remedies and continue to work with the automaker to ensure the highest level of safety.”

But its statement found an upside to Autopilot and similar technologies, adding, “Automated technology holds great promise for improving safety but only when it is deployed responsibly; today’s action is an example of improving automated systems by prioritizing safety.”

A mixed record

While some proponents agree that technologies like Autopilot can help prevent crashes, the Tesla system has generated significant concerns since it was first released.

NHTSA has launched a series of investigations since then, most recently a review of 956 crashes that occurred while Autopilot was reportedly in use. That review grew out of a probe the agency opened in August 2021 into a series of incidents in which Tesla products struck stationary emergency vehicles; the probe was upgraded a year later as further reports came in.

Tesla’s Autopilot technology has been under close examination for several years.

The new recall will only involve updating software, and it appears Tesla will be able to accomplish that using built-in, smartphone-style over-the-air update technology, rather than forcing owners to come to its service facilities.

Autopilot isn’t out of the woods

How the recall will be received is uncertain. For Tesla, it means NHTSA will continue to permit the use of Autopilot and the sale of the technology — as well as the upgraded Full Self-Driving system — in its products.

But those features could face additional challenges. The U.S. Justice Department issued subpoenas in October looking into the claims that Autopilot and FSD make it possible for Tesla vehicles to drive themselves.

Separately, the California Department of Motor Vehicles continues its own investigation looking into whether Tesla “fraudulently” markets the two systems. The DMV could levy fines against the automaker or even ban it from selling vehicles in California, its largest market in the U.S.

