
Tesla Autopilot Takes New Heat – And California Says the Name is Part of the Problem

December 12, 2023

Tesla is coming under more intense fire as a result of a series of crashes involving its Autopilot system. The NTSB wants the automaker to block its use on roads where it’s not safe to operate. And California regulators say the name, Autopilot, is itself false and misleading and should be changed.


A dashcam caught the moments before a Tesla ran across a T-intersection and slammed into a parked Chevy Tahoe, killing one person and seriously injuring another. (Crash images courtesy of the Washington Post.)

For many owners, Tesla’s Autopilot system was one of the biggest reasons they bought one of the automaker’s EVs. But the semi-autonomous technology is coming under increasing fire from regulators and safety advocates.

The technology is now the target of multiple state and federal investigations, along with civil lawsuits filed by owners or survivors of those killed in crashes. The National Transportation Safety Board wants new rules that restrict the use of Autopilot on roads where it was not designed to operate. And California’s Department of Motor Vehicles contends Tesla falsely advertised the capabilities of both Autopilot and Tesla’s more advanced Full Self-Driving system.

Death toll mounts

Autopilot has now been directly linked to dozens of crashes resulting in fatalities or serious injuries. The National Highway Traffic Safety Administration currently has multiple investigations underway. Among other things, the agency is looking at why vehicles operating under the system have crashed into stationary emergency vehicles.


Teslas operating under Autopilot have been involved in a number of crashes involving fatalities and serious injuries.

Meanwhile, “At least eight fatal or serious Tesla crashes occurred on roads where Autopilot should not have been enabled in the first place,” the Washington Post reported over the weekend. That includes one incident in Key Largo, Florida, where a Tesla shot across a T-intersection at 70 mph, striking a Chevy Tahoe. The Tahoe’s driver, Dillon Angulo, and his passenger, Naibel Benevides Leon, had parked off the road to look at the stars when they were hit. He was seriously injured; she was killed.


Leaving it up to drivers

Autopilot, according to Tesla’s own website and instructions, should not be used completely hands-free. And it should only operate in specific conditions, such as divided, multi-lane highways.

“In user manuals, legal documents and communications with federal regulators, Tesla has acknowledged that Autosteer, Autopilot’s key feature, is ‘intended for use on controlled-access highways’ with ‘a center divider, clear lane markings, and no cross traffic,’” the Post wrote. The paper also noted that Autopilot can have problems on hilly or sharply curved roads.

What has concerned regulators is that Tesla continues to allow Autopilot to be used on roads for which it is not intended. Following a fatal crash in 2018, the automaker told the NTSB that “the driver determines the acceptable operating environment.” That’s in sharp contrast to what key competitors do.

The competition


NTSB Chair Jennifer Homendy has taken a rare shot at NHTSA for failing to crack down on where Autopilot can be used.

Ford, with its BlueCruise system, and General Motors, with Super Cruise, actually go a step beyond Autopilot, allowing full hands-free operation in carefully “geofenced” locations and under optimum driving conditions. Otherwise, those technologies will either not start up, or they will automatically advise a driver to retake control.

Until recently, Tesla placed few restrictions on the use of either Autopilot or Full Self-Driving. It has only begun using in-cabin technology to make sure drivers keep their eyes on the road, as Ford and GM do, and that approach has not been retrofitted into older Teslas. So, in the Florida crash that killed Benevides Leon, the driver of the Tesla acknowledged taking his eyes off the road – missing flashing lights and warning signs – while he reached for the cellphone he’d dropped.

Regulatory debate

The fact that drivers can activate Autopilot anywhere has been a subject of intense debate since shortly after another Florida crash, in 2016, in which Joshua Brown was killed when his Model S drove into the side of a semi-truck. The NTSB and NHTSA both investigated and eventually determined that while Autopilot failed to respond properly, Brown also shared the blame because he was watching a video instead of focusing on the road.

The NTSB can only investigate crashes; it cannot take regulatory action. It followed up by calling on NHTSA to do so but, at this point, that agency has yet to put any rules in place.

In an October interview with the Post, NTSB Chair Jennifer Homendy called that “a real failure of the system.” She said NHTSA needs to “stand up” on behalf of the public, adding that “safety does not seem to be the priority when it comes to Tesla.”


Autopilot isn’t

Even as investigators examine the causes of numerous recent crashes linked to Autopilot, the California DMV has turned up the heat on Tesla in a different manner. It is contesting the use of both the Autopilot and Full Self-Driving names, which, regulators say, improperly describe what the technologies can do – contending the branding amounts to false advertising.

The DMV is weighing various actions it could take, ranging from fines all the way up to revoking Tesla’s license to sell vehicles in California, its single largest market in the U.S.

Tesla continues to defend itself and, among other things, says state regulators had the opportunity to take action back in 2014, when the use of the name Autopilot was first investigated. The agency again took no action in 2017, when other terms were investigated.

“The DMV chose not to take any action against Tesla or otherwise communicate to Tesla that its advertising or use of these brand names was or might be problematic,” Tesla said in a December 5 filing with the California Office of Administrative Hearings.

What did Tesla know?

Tesla has faced several civil lawsuits regarding Autopilot this year. In the latest, a Florida judge last month ruled there was “reasonable evidence” that the automaker and CEO Elon Musk knew about defects with the system. He has allowed the case, involving a fatal crash near Miami in 2019, to proceed to trial.

The automaker did not respond to queries from Headlight.News regarding this story. It has disbanded its communications department and seldom responds to journalist queries.

