Tesla Says Full Self-Driving Beta May Cause Crashes, Recalls Over 360,000 Vehicles

February 20, 2023

Tesla recalled 362,758 vehicles last week, warning that its Full Self-Driving (FSD) Beta — pricey experimental driver-assistance software — may cause crashes. Tesla will deliver an over-the-air software update to cars equipped with the system.

The National Highway Traffic Safety Administration (NHTSA) issued this notice last Thursday, specifying how these accidents could occur.

The FSD Beta system may allow the vehicle to make unsafe maneuvers in intersections, including:

  • traveling straight through an intersection while in a turn-only lane,
  • entering a stop sign-controlled intersection without coming to a complete stop, or
  • proceeding into an intersection during a steady yellow traffic signal without due caution.

On top of that, the system may respond insufficiently to changes in posted speed limits, or fail to adequately account for the driver adjusting the vehicle's speed to exceed those limits.

For firm co-founder and personal injury attorney Miguel Custodio, this news does not come as a shock. This isn’t the first recall we’ve seen from Tesla addressing this issue, and it likely won’t be the last. As Miguel noted in his June 2022 column for the Daily Journal, “Many of these recalls stem from a software update that Tesla could quickly disable or address with another update – meaning in most cases there is no actual physical recall.” These recalls amount to little more than a slap on the wrist for the company, and likely do little to discourage it from introducing similar software in the future.

Miguel’s column also raises the issue of Tesla naming its partially automated system “Autopilot” and the consequences of doing so, spotlighting the deadly 2019 Tesla crash in Gardena.

Los Angeles resident Kevin Riad was charged last year with two counts of vehicular manslaughter in connection with the crash, in which the car, with Tesla’s Autopilot feature engaged, ran a red light at 74 mph and killed two people. It is the first criminal case against a driver who was using a partially automated driving system at the time of a fatal crash.

“In California, automaker liability essentially boils down to what the customer was seemingly promised. Tesla may be held liable, for example, if it is found that Riad’s overconfidence in the autonomous driving function was directly attributable to Tesla advertising that overstated the car’s self-driving capabilities,” Miguel pointed out.

Just as we can infer that Tesla won’t stop issuing what it prefers to call “soft recalls” to address serious issues, it’s likely that this will not be the last criminal case we see in connection to accidents involving partially automated driving systems.

As we saw over the weekend in Walnut Creek, it isn’t far-fetched to consider the potential role of Autopilot in Tesla crashes. Early Saturday morning, the driver of a Tesla was killed after the car crashed into a parked firetruck on Interstate 680.

While the cause of the accident is still unclear, the report noted that about 14 Teslas have crashed into emergency vehicles while on Autopilot. NHTSA is currently investigating Tesla’s Autopilot to see how it detects and responds to emergency vehicles parked on freeways.

Tesla’s recall of the FSD Beta is a temporary solution to a much larger problem that stems from its Autopilot software. As Miguel says, “I think autonomous vehicles are a revolutionary technology with the potential to transform transportation as we know it. But the consequences will be costly if we once again allow our laws to lag significantly behind the technology they dictate.”

To see if your Tesla is involved in this recall, visit the NHTSA’s website.

If you or someone you know was injured in an accident involving partially automated driving technology, contact the skilled personal injury lawyers at Custodio & Dubey LLP. With over 25 years of experience, our lawyers will guide you every step of the way to help you receive the justice you deserve.