source link: https://arstechnica.com/cars/2021/11/tesla-recalls-11706-vehicles-over-full-self-driving-beta-software-bug/

time to play by the rules? —

Tesla recalls 11,706 vehicles over Full Self-Driving Beta software bug

The new software bug caused phantom emergency braking events.

Jonathan M. Gitlin - 11/2/2021, 6:06 PM

Tesla is issuing a recall over phantom collision warnings and brake activations.
Aurich Lawson | Tesla

Tesla's controversial "Full Self-Driving" feature took another hit on Tuesday. The Texan automaker issued a recall for nearly 12,000 vehicles after an over-the-air software update introduced a new bug that can cause false activations of the cars' forward collision warning and automatic emergency braking (AEB) systems.

According to the safety recall report, the problem affects Model S, Model X, and Model 3 vehicles built between 2017 and 2021 and Model Y vehicles built between 2020 and 2021, all running firmware release 2021.36.5.2. The updated firmware was rolled out to drivers in Tesla's beta-testing program on October 23 and, once installed, caused a pair of chips to stop talking to each other when the vehicle wakes up from "sentry mode" or "summon standby mode."

That error prevents the neural networks running on one of the chips from executing consistently, causing the system to throw false-positive collision warnings and, more seriously, false-positive AEB activations.
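
The failure chain described above, a dead inter-chip link feeding inconsistent neural-network output into a safety feature, is the kind of fault a defensive health check is meant to catch. The sketch below is purely illustrative; the heartbeat interface, thresholds, and names are all invented and reflect nothing about Tesla's actual firmware. It simply gates AEB on link health, preferring a temporarily unavailable feature over a phantom activation, which is essentially the remedy Tesla applied when it disabled the features fleet-wide.

```python
# Illustrative sketch only -- not Tesla's code. If the link between the
# two compute chips looks unhealthy, treat perception output as invalid
# and inhibit automatic braking instead of acting on inconsistent results.
import time
from dataclasses import dataclass

HEARTBEAT_TIMEOUT_S = 0.1  # hypothetical staleness threshold

@dataclass
class PerceptionFrame:
    timestamp: float       # when the frame was produced
    collision_risk: float  # 0.0 (none) to 1.0 (imminent), hypothetical scale

def link_healthy(last_heartbeat: float, now: float) -> bool:
    """The inter-chip link counts as healthy only if a heartbeat
    arrived within the timeout window."""
    return (now - last_heartbeat) <= HEARTBEAT_TIMEOUT_S

def should_trigger_aeb(frame: PerceptionFrame, last_heartbeat: float,
                       now: float, risk_threshold: float = 0.9) -> bool:
    # Fail safe: with a degraded link, the neural-network output may be
    # inconsistent, so decline to auto-brake rather than risk a phantom
    # activation.
    if not link_healthy(last_heartbeat, now):
        return False
    return frame.collision_risk >= risk_threshold

# Example: a stale heartbeat suppresses AEB even for a high-risk frame.
now = time.monotonic()
frame = PerceptionFrame(timestamp=now, collision_risk=0.95)
print(should_trigger_aeb(frame, last_heartbeat=now - 0.5, now=now))   # False
print(should_trigger_aeb(frame, last_heartbeat=now - 0.01, now=now))  # True
```

The fail-safe direction matters: for a driver-assistance feature, silently doing nothing on degraded data is usually safer than acting on it, though a production system would also need to alert the driver that the feature is offline.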

Tesla acted quickly after unleashing the faulty software. After receiving multiple reports of problems, the company halted the rollout and, by the next day, disabled the two safety features on the affected cars. On October 25, a new firmware version was released, correcting the problem and restoring collision warnings and AEB on those cars.

Perhaps the most unusual aspect of this story is that Tesla initiated the recall process through the National Highway Traffic Safety Administration for a software issue. Almost all the affected cars have already been patched, and Tesla doesn't often feel the need for such formality.

However, after years of admonishment and pleading by the National Transportation Safety Board, the new regime at the NHTSA has begun to apply a little more scrutiny to Tesla and its Autopilot system.

Promoted Comments

  • OK, but when are they going to address the phantom AEB events in all their cars, regardless of FSD status? We rarely use Autopilot on our M3 because it's scared us with such an event enough times, and we're not willing to risk being rear-ended for no reason.

    I don't care that the other driver will most likely be found responsible. It's a huge headache to deal with it, and parts for Tesla are in very short supply, leading to weeks / a month or two to get stuff fixed.
  • mccross90 wrote:
    Perhaps, just perhaps, Tesla should hire professionally trained safety drivers to beta test their software on a controlled fleet of vehicles rather than have the public do it for them.
    Even more important in creating safe self-driving software is adopting an industry-standard automotive safety standard, such as ISO 26262. Life-critical software needs to be developed using best practices and tested rigorously. Short version: safety-critical software can't be developed using an app model.

    Currently, I'm a lead engineer on an aviation software project certifying to DO-178C DAL A. (Translation: if our code is buggy, we can lose an aircraft and its occupants.) We document, review, and verify all of our software design and implementation. We test the hell out of it, from high-level integration testing to low-level testing, while achieving 100% statement, branch, and modified condition/decision coverage (MC/DC; a short illustration follows these comments). We have 5-20 lines of test code per line of flight code. Expensive, but that's what good life-critical SW costs!

    Tesla's model for "beta" app-like self-driving software mystifies me - isn't this just an invitation for lawsuits?
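
To make the coverage point above concrete, here is a minimal sketch of what MC/DC demands, using an invented AEB-style decision; the function and its inputs are hypothetical and come from no real codebase. For n independent conditions, MC/DC can be satisfied with as few as n + 1 tests, each showing one condition independently flipping the decision's outcome.

```python
# A toy illustration of MC/DC (modified condition/decision coverage),
# the criterion the commenter cites from DO-178C Level A work. The
# decision below is invented for the example.

def aeb_commanded(radar_confirms: bool, vision_confirms: bool,
                  driver_braking: bool) -> bool:
    """Hypothetical decision: command emergency braking only when both
    sensors agree on an obstacle and the driver is not already braking."""
    return radar_confirms and vision_confirms and not driver_braking

# Minimal MC/DC test set: n + 1 = 4 cases for 3 conditions. Each pair of
# rows differing in exactly one input demonstrates that condition's
# independent effect on the outcome.
MCDC_CASES = [
    # (radar, vision, braking) -> expected
    ((True,  True,  False), True),   # baseline: AEB fires
    ((False, True,  False), False),  # flips radar_confirms  -> outcome flips
    ((True,  False, False), False),  # flips vision_confirms -> outcome flips
    ((True,  True,  True),  False),  # flips driver_braking  -> outcome flips
]

for (radar, vision, braking), expected in MCDC_CASES:
    assert aeb_commanded(radar, vision, braking) == expected
print(f"All {len(MCDC_CASES)} MC/DC cases pass")
```

Plain decision (branch) coverage would be satisfied by just the first two cases here; MC/DC is the stricter criterion that forces a demonstration of each condition's independent effect, which is why test volume grows the way the commenter describes.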
