Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • CmdrShepard@lemmy.one · 1 year ago

    I’d hope they would before willfully getting behind the controls of one to operate it. Regardless of what we call it, these people still would have crashed. They both drove into the side of a semi while sitting in the driver’s seat with their hands on the wheel.

    • renohren@partizle.com · 1 year ago (edited)

      That’s because Tesla induced them to think it was Level 4 or 5, while FSD is Level 2 (like most Toyotas), just with a few extra options.

      And as long as a human is required to assume responsibility, it will remain at Level 2.

      • CmdrShepard@lemmy.one · 1 year ago

        A) Autopilot and the FSD beta are two totally separate systems, and FSD wasn’t even available as an option when one of these crashes occurred.

        B) Where’s the evidence that these drivers believed they were operating a Level 4 or 5 system?

        • renohren@partizle.com · 1 year ago (edited)

          I’ll repeat what others have said: they could have called it “Advanced Autopilot”, “Advanced Driving Assist”, “Magic Drive”, “So Close To Our Goal”, “Monitored Self Driving”, or “Assisted Self Driving” (that’s as close as I’d go)… anything BUT “Full Self Driving”.