A judge in Florida cited “reasonable evidence” that Tesla and its CEO Elon Musk knew the automaker’s Autopilot system had a potentially fatal defect, a finding that could create serious problems for Tesla as it faces the third trial this year over the semi-autonomous driving system.
A Florida Circuit Court judge ruled Tesla will face trial over claims that a defect involving its Autopilot system contributed to a fatal 2019 crash near Miami.
In his ruling, Judge Reid Scott, of the Circuit Court for Palm Beach County, cited “reasonable evidence” that Tesla, CEO Elon Musk and other executives were aware of potential defects with the automaker’s Autopilot system. The plaintiffs in the case had accused Tesla of knowingly allowing unsafe vehicles to be driven on public roads.
Scott noted the 2019 accident, which killed Tesla Model 3 owner Stephen Banner, was “eerily similar” to another fatal Florida crash in 2016. That accident occurred when the Autopilot system in former Navy SEAL Joshua Brown’s Model S failed to detect a semi-truck crossing its path.
“Reasonable evidence”
“It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers was acutely aware of the problem with the ‘Autopilot’ failing to detect cross traffic,” Scott wrote in an opinion allowing the latest lawsuit to go to trial.
He also cited a widely seen Tesla video released in 2016 in which a Model S was shown driving without human intervention. The video stated “The car is driving itself,” and included a disclaimer declaring that while there was a person behind the wheel, he was there solely for legal reasons.
Tesla’s website also contains a disclaimer stating that a human driver must maintain at least a light grip on the wheel at all times. But the automaker has come under frequent fire from critics who say it overstates the capabilities of Autopilot, as well as those of the more advanced Full Self-Driving system. Despite its name, Full Self-Driving does not allow the vehicle to drive itself, though Musk has for several years promised that an upgrade enabling it to do so will soon be released.
The Banner case “opens the door for a public trial”
In the case set to go to trial, Banner was traveling north of Miami when his vehicle drove underneath an 18-wheeler that turned in front of it. The roof of the vehicle was sheared off and Banner was killed instantly. Banner’s widow claims the electric sedan was operating in Autopilot mode at the time of the crash but failed to brake or take other evasive action to avoid the collision.
Tesla attempted to dismiss the claims made in the lawsuit by, among other things, asserting, “There are no self-driving cars on the road today.”
The judge raised several key counterpoints in his decision to permit the case to go to trial. First, he indicated Tesla has taken steps that could lead a driver to believe Autopilot can be used hands-free. Second, he found reasonable evidence not only that the Autopilot system was defective but that Tesla and its CEO knew of the defects and failed to take appropriate action.
“This opinion opens the door for a public trial in which the judge seems inclined to admit a lot of testimony and other evidence that could be pretty awkward for Tesla and its CEO,” Bryant Walker Smith, a University of South Carolina law professor, told Reuters. “And now the result of that trial could be a verdict with punitive damages.”
Autopilot under fire, federal investigation
Whether those and other arguments made by the plaintiffs will hold up before a jury remains to be seen. But Tesla prevailed this year in two other trials in California centered on crashes involving Autopilot.
In one of those cases, the family of Model 3 owner Micah Lee blamed product defects for a 2019 crash that killed Lee and seriously injured two passengers on a highway near Los Angeles. According to police reports, Lee’s vehicle suddenly veered off the road, struck a palm tree and burst into flames. One of the passengers, an 8-year-old boy, was disemboweled.
Tesla’s various semi-autonomous technologies have come under increasing scrutiny as crashes, deaths and injuries continue to mount. In a number of instances, its vehicles have struck stationary emergency vehicles without warning. The National Highway Traffic Safety Administration currently has several open investigations involving Autopilot and FSD.