Tesla Autopilot: Elon Musk is fully behind it – but the accidents are increasing

Pile-up in San Francisco

A Tesla Model 3 suddenly braked on the Bay Bridge – a total of eight vehicles were involved in the ensuing accident.

© Dylan Stewart / Picture Alliance

In November, a serious accident involving eight vehicles occurred in the USA. It was caused by a Tesla that behaved erratically. Now a video shows what happened – and highlights a broader problem.

At the end of November, a serious accident occurred on the Bay Bridge in San Francisco. The trigger was a Tesla that suddenly changed lanes and, shortly thereafter, braked abruptly, causing a pile-up. The driver told authorities that he had not performed the maneuver himself but was being chauffeured by Tesla's Full Self Driving (FSD) mode at the time of the accident. According to him, it was the software that initiated the braking for no apparent reason.

This statement has not yet been confirmed, but it fits a behavior pattern of the vehicle that other Tesla customers have already complained about.

The case quickly made it into reports on American television stations, but without footage of the accident. The magazine "The Intercept" has now obtained the accident report and recordings from the California Highway Patrol via a public-records request.

Accident video shows the Tesla's unpredictable behavior

The video confirms the driver's description of how the accident happened. The police report states that "Full Self Driving" was apparently not working correctly at the time of the accident – at least that is what the driver said when questioned. In the Highway Patrol report, the officer states that he cannot verify this information.

According to the report, the car was running version 11 of the current FSD beta on the day of the accident. A Tesla with this software is expressly not classified as an "autonomous vehicle", which is why the driver is accused of not intervening immediately when the malfunction occurred. Tesla also notes in the fine print that the "Autopilot" function "does not turn the vehicle into a self-driving car".

The braking maneuver is evidently not an isolated case. The "Washington Post" reported back in February 2022 that the federal agency was receiving more and more complaints about sudden braking with FSD active. The phenomenon, known in the USA as "phantom braking", had already caused problems for numerous drivers before the San Francisco accident; the report cites 107 complaints in just three months.

The timing of the accident was extremely unfavorable for the automaker. Just a few hours earlier, Tesla boss Elon Musk had written on Twitter that "Full Self Driving" was now available in North America to anyone who had booked the function for their car. Previously, only selected people had access to it.

Tesla is responsible for hundreds of accidents

For the US National Highway Traffic Safety Administration (NHTSA), it is just one more investigation out of hundreds. Between July 2021 and June 2022, "Autopilot" – or "Full Self Driving" – was associated with 273 accidents. According to another report by the "Washington Post", the agency attributed 70 percent of 329 accidents involving advanced driver assistance systems to Tesla.

Incidents related to Tesla's "Full Self Driving" mode keep occurring in the United States. Depending on its range of functions, it is also referred to as "Autopilot". These names create the expectation among drivers that they no longer need to pay attention to traffic. It was not until the end of December 2022 that the state of California banned the company from using the apparently misleading product names.

Elon Musk wants to abolish the hands-on-the-wheel requirement

For Tesla, more is at stake than just wrecked vehicles. In an interview, Elon Musk said: "For Tesla, mastering full self-driving makes the difference between a very high company value and worthlessness." Musk keeps promoting Autopilot and FSD, and the company usually downplays accidents and errors or blames incorrect operation.

There does not seem to be any improvement in sight unless the authorities intervene. On the contrary: in late December, a user asked Elon Musk on Twitter whether the constant warnings reminding drivers to keep their hands on the wheel could be turned off. The suggestion: anyone who has driven 10,000 miles with FSD should be allowed to disable them. Musk replied: "I agree, we'll have an update for that in January." So far, it has not been released.

In early January, Tesla released FSD update 2022.44.30.5, which introduced penalties for inattentive drivers: anyone who takes their hands off the steering wheel too often is blocked from using the function for up to two weeks. How this is to be reconciled with the boss's statements remains unclear.
