Tesla forced to update more than 300,000 vehicles after its “autonomous” driving assistant is deemed dangerous

A hard blow for Tesla and for its customers. The manufacturer will have to update 362,758 cars in the United States after the National Highway Traffic Safety Administration (NHTSA) concluded that the beta of its “Full Self-Driving” (FSD) software could cause vehicles to act in a potentially dangerous way at intersections, with a risk of accidents.

Investors reacted badly to the notice, published on Thursday, and the electric-vehicle maker’s stock fell 5.69% during the session. The recall covers every model in the lineup, the Model S, X, Y and 3, equipped with the FSD software or scheduled to receive it, for which Tesla charges $15,000.

FSD, which is still in a test phase, is classified as a Level 2 system, meaning it provides driver assistance rather than autonomous driving, despite its name. It requires the driver to keep their hands on the steering wheel and to be ready to intervene at any time.

Dangerous driving

According to the notice released by NHTSA, flaws in the software, when it is enabled, can cause the vehicle to continue straight after entering a turn-only lane that requires a turn. A car with FSD engaged can also drive through an intersection controlled by stop signs without coming to a complete stop, or proceed through an intersection on a steady yellow light without slowing down.

Affected vehicles may also “respond insufficiently to changes in posted speed limits” or fail to intervene when the driver exceeds the maximum permitted speed, according to the agency.

To address the defects reported by NHTSA, Tesla plans to perform a software update at its own expense, the notice said. At this stage, it is unclear whether Tesla will be able to fix the issues raised or will have to downgrade the software to a more limited version and reimburse users.

Software update

However, the recall does not require owners to bring their vehicles to a Tesla service center. “The term ‘recall’ to describe a software update is anachronistic and simply wrong,” Elon Musk tweeted in response to the announcement.

In June 2022, NHTSA published a report stating that Teslas whose driver-assistance software had been active at some point during the 30 seconds preceding the incident had been involved in 273 road accidents in the United States.

The US Department of Justice has opened an investigation into Tesla’s driver assistance systems, according to a document published in late January by the stock market regulator, the SEC.

In the document, Tesla reiterated that FSD and its other software, known as “Autopilot”, were “designed to be used by an alert driver whose hands are on the wheel and who is ready to take back control at any time”.

But for several years, the company’s chief executive, Elon Musk, has regularly gone much further in his statements. As early as 2019, he promised that within the year Tesla would put into service a vehicle capable of fully autonomous driving, without any human intervention. No vehicle in the range is currently equipped with such software.

In an interview with CNBC in October 2021, Jennifer Homendy, head of the National Transportation Safety Board (NTSB), the American agency responsible for investigating transport accidents, called the use of the term “full self-driving” “misleading”.
