Tesla drivers run Autopilot where it’s not intended — with deadly consequences

    • Nurse_Robot@lemmy.world · 1 year ago

      A Tesla driving on Autopilot crashed through a T intersection at about 70 mph and flung the young couple into the air, killing Benavides Leon and gravely injuring Angulo. In police body-camera footage obtained by The Washington Post, the shaken driver says he was “driving on cruise” and took his eyes off the road when he dropped his phone.

      But the 2019 crash reveals a problem deeper than driver inattention. It occurred on a rural road where Tesla’s Autopilot technology was not designed to be used. Dash-cam footage captured by the Tesla and obtained exclusively by The Post shows the car blowing through a stop sign, a blinking light and five yellow signs warning that the road ends and drivers must turn left or right.

      • CmdrShepard@lemmy.one · 1 year ago

        It’s insane that people are blaming this on Autopilot when there is a driver sitting behind the wheel who also missed a stop sign, a blinking light, and five yellow warning signs while driving at 70 mph. You could physically do this with any other car that has cruise control, and nobody would be blaming the car.

            • XeroxCool@lemmy.world · 1 year ago

              The above comment, which was a summary of the article, doesn’t blame Autopilot. It brings up Autopilot as being used by an inattentive driver outside Autopilot’s intended use conditions. Acting like Autopilot, its marketing, and the general population’s perception of it are innocent bystanders in this situation is, however, disingenuous. You don’t give a car to someone and say “it has airbags, it’s safe” and trust that they’ll actually be OK on the road with no further info, right? So why would you think releasing untested software in a product with overhyped marketing using unfamiliar terms^1 would just be OK?

              1. The gen pop thinks autopilot can land planes. Any autopilot.
          • jet@hackertalks.com · 1 year ago (edited)

            I’ll blame Autopilot. It’s good enough to train people not to pay attention, but not good enough to be fully driverless. So users are being trained to fully trust something they can’t fully trust.

            If you were trying to teach a new driver how to drive, you wouldn’t do 99.9% of the driving for them and then randomly throw them into the driver’s seat when there’s an emergency. That’s not how you get a good driver; that’s how you get a bunch of accidents. We know that at the human level: if you want to train a driver, you let them practice on the easy stuff, you keep them engaged, you keep them thinking about it.

            Tesla’s semi-automated self-driving is the reverse: the computer does all the easy stuff, and the human has to handle the rare, difficult emergencies. But the human gets no practice. It’s the gold standard of how to create accidents.

    • SomeoneSomewhere@lemmy.nz · 1 year ago

      In user manuals, legal documents and communications with federal regulators, Tesla has acknowledged that Autosteer, Autopilot’s key feature, is “intended for use on controlled-access highways” with “a center divider, clear lane markings, and no cross traffic.” Tesla advises drivers that the technology can also falter on roads if there are hills or sharp curves, according to its user manual. Even though the company has the technical ability to limit Autopilot’s availability by geography, it has taken few definitive steps to restrict use of the software.

      Tesla told NTSB that design limits for Autopilot would not be appropriate because “the driver determines the acceptable operating environment.”

      He said Tesla could easily limit where the technology can be deployed. “The Tesla knows where it is. It has navigation. It knows if it’s on an interstate or an area where the technology wasn’t designed to be used,” he said. “If it wasn’t designed to be used there, then why can you use it there?”
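
      The gating described in that quote is not exotic; it amounts to a lookup against data the navigation stack already has. A minimal sketch of such an operating-domain check, with every name and field hypothetical rather than anything from Tesla’s actual software:

      ```python
      from dataclasses import dataclass

      @dataclass
      class RoadContext:
          """Hypothetical snapshot of what the car's navigation stack already knows."""
          road_class: str            # e.g. "motorway", "rural", "residential"
          has_center_divider: bool
          lane_lines_detected: bool
          cross_traffic_possible: bool

      def autosteer_permitted(ctx: RoadContext) -> bool:
          # Gate the feature on the conditions Tesla's own manual names:
          # controlled-access highway, center divider, clear lane markings,
          # no cross traffic.
          return (
              ctx.road_class == "motorway"
              and ctx.has_center_divider
              and ctx.lane_lines_detected
              and not ctx.cross_traffic_possible
          )

      # The rural T-intersection from the crash would fail every check:
      crash_site = RoadContext("rural", False, False, True)
      print(autosteer_permitted(crash_site))  # False
      ```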

      In a sworn deposition last year first detailed by Reuters and obtained by The Post, Tesla’s head of Autopilot, Ashok Elluswamy, said he was unaware of any document describing limitations on where and under what conditions the feature could operate. He said he was aware of some activation conditions for Autopilot, including the presence of lane lines, and that it is safe for “anyone who is using the system appropriately.”

      Tesla’s commitment to driver independence and responsibility is different from that of some competitors, whose driver-assistance technologies are loaded with high-definition maps with rigorous levels of detail that can tip vehicles off to potential roadway hazards and obstructions. Some manufacturers, including Ford and General Motors, also allow the technology to work only on compatible roadways that have been meticulously mapped.
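
      The Ford/GM approach the article describes is effectively a default-deny whitelist: assistance activates only on road segments present in a pre-mapped set. A sketch under that assumption, with illustrative segment IDs and no vendor’s real API:

      ```python
      # Default-deny whitelist: assistance works only on pre-mapped segments.
      # Segment IDs and the lookup are illustrative, not any vendor's real API.
      APPROVED_SEGMENTS: set[str] = {
          "I-80:exit12-exit19",
          "I-5:mile402-mile447",
      }

      def assistance_available(current_segment: str) -> bool:
          # Unmapped roads never qualify, rather than leaving it to the driver
          # to "determine the acceptable operating environment".
          return current_segment in APPROVED_SEGMENTS

      print(assistance_available("rural-route-7"))  # False: not in the mapped set
      ```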

      Over the years, NTSB has repeatedly called on NHTSA to rein in Autopilot. It has also urged the company to act, but NTSB Chair Jennifer Homendy said Tesla has been uniquely difficult to deal with when it comes to safety recommendations. Tesla CEO Elon Musk once hung up on former NTSB chair Robert Sumwalt, who retired from the agency in 2021 when Homendy took over.

      https://web.archive.org/web/20231210125240/https://www.washingtonpost.com/technology/2023/12/10/tesla-autopilot-crash/