Tesla Recalls 362,758 Cars over Full Self-Driving Crash Risk

  • Tesla is recalling 362,758 vehicles due to problems with the Full Self-Driving software that can allow vehicles to exceed speed limits or travel through intersections in an unlawful or unpredictable manner, according to filings with the National Highway Traffic Safety Administration (NHTSA).
  • The issues affect a range of years throughout the lineup, including certain Model 3, Model X, Model Y and Model S units manufactured between 2016 and 2023.
  • Tesla said it will issue a free over-the-air (OTA) software update for the affected vehicles and will send notification letters to owners by April 15, 2023.

Tesla is recalling hundreds of thousands of vehicles over safety concerns regarding the company’s Full Self-Driving (FSD Beta) automated driving software. The recall affects a total of 362,758 vehicles, including certain Model 3, Model X, Model Y and Model S EVs manufactured between 2016 and 2023.

Filings with NHTSA show that vehicles using the FSD Beta can behave in an unsafe manner, with particular concerns at intersections. According to NHTSA documents, vehicles may travel straight through an intersection while in a turn-only lane, enter a stop sign-controlled intersection without coming to a complete stop, or proceed into an intersection during a steady yellow traffic signal without due caution. The software may also fail to recognize changes in posted speed limits, and fail to slow the vehicle when it enters a slower traffic area.

Tesla will release a free over-the-air (OTA) software update to address the problem. Notification letters are expected to be mailed to owners by April 15, 2023. Owners can contact Tesla customer service at 877-798-3752. Tesla's number for this recall is SB-23-00-001.

NHTSA’s Office of Defects Investigation opened a preliminary investigation into the performance of FSD. The investigation was motivated by an accumulation of accidents in which Tesla vehicles, operating with Autopilot, struck stationary in-road or roadside first-responder vehicles tending to pre-existing crash scenes, according to NHTSA. The original preliminary evaluation was later upgraded to an Engineering Analysis (EA) to expand on the existing crash analysis, evaluate additional data sets, conduct vehicle evaluations, and determine the extent to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.