Tesla and Elon Musk are said to have accepted errors in Autopilot

Elon Musk suffers defeat in court: Tesla is said to have known about autopilot errors

Tesla CEO Elon Musk will likely have to prepare for a lawsuit

© Kirsty Wigglesworth/AFP

For years there have been disputes over the safety and reliability of Tesla's assistance systems, known as Autopilot and FSD. A judge has now found evidence that Elon Musk and his team may have knowingly ignored flaws in the software.

The name of Tesla's Autopilot suggests that it can drive a vehicle safely and without assistance; the more advanced "FSD" package ("Full Self-Driving") even more so. In Germany it is marketed as "full potential for autonomous driving". Yet countless real-world examples show that a Tesla should never be driven inattentively, and the law in fact requires the driver's full attention.

In January, test drivers for the "New York Times Magazine" revealed glaring deficiencies, and the US National Highway Traffic Safety Administration (NHTSA) is also investigating the software's problems in several cases.

“Sufficient evidence” that Musk and Tesla allegedly knew

After several fatal accidents, survivors understandably filed complaints. As "Reuters" reports, there has now been a preliminary decision in one case: Judge Reid Scott in the US state of Florida found "sufficient evidence" that Tesla boss Elon Musk and other managers allegedly knew that the vehicle systems had defects that could pose a danger in certain situations.

The lawsuit was filed by the wife of a deceased man whose 2019 Tesla Model 3, for reasons that were initially unclear, drove under the trailer of a maneuvering heavy truck. The roof was sheared off and the driver was fatally injured.

This is a big problem for Tesla and Musk, because the plaintiff can now go to court over the fatal accident and assert claims for damages against Tesla for intentional misconduct and gross negligence.

What's more: According to Bryant Walker Smith, a law professor at the University of South Carolina, the ruling "opens the door to a public trial in which the judge appears inclined to allow a lot of testimony and other evidence" unfavorable to Tesla, which could prove quite unpleasant for its CEO, he told Reuters.

Judge saw parallels to other Tesla accidents

Judge Scott apparently saw similarities with another crash back in 2016. In that fatal accident, too, a truck trailer was misidentified by the system.

Tesla's Autopilot has repeatedly been reported to make driving errors, some of which the "New York Times Magazine" was even able to reproduce in its report. Bright lights, whether from emergency vehicles or traffic signals, posed particular problems.

“It would be plausible to conclude that Tesla was aware of the problem through its CEO and engineers,” the judge wrote, according to Reuters.

Tesla and Musk often promoted the high level of safety of the vehicles

Tesla, for its part, has often advertised these systems, especially Autopilot and FSD, with the promise that the vehicles can drive themselves. One such promotional video, for example, opens with a disclaimer stating that the person in the driver's seat is present only for legal reasons.

If Tesla loses this case, the decision could trigger a chain reaction. Only in October, "Handelsblatt" reported on a lawsuit by two US pension funds that invested in the company in 2019 and now hold the allegedly false statements responsible for share price losses.

Since these proceedings are a class action lawsuit that other people can join, the amount of possible compensation for investors cannot yet be predicted – but it could have a serious impact on Tesla if a judge rules against the company.

Because Elon Musk has personally and repeatedly vouched for the safety of his vehicles, he himself is also a target of that lawsuit.
