• Shou@lemmy.world · ↑2 · 9 hours ago

      Dunno man, I don’t want to risk the lives of those in normal cars. Just one bad day for an overworked guy driving at night and you’ll have the same issue.

  • JasonDJ@lemmy.zip · ↑74 ↓1 · 2 days ago

    This is from Mark Rober’s YouTube channel.

    Credit where credit is due.

    Mark Rober is fucking awesome, btw. He does all sorts of fun STEM things and keeps it interesting for kids and adults.

    • Woht24@lemmy.world · ↑17 ↓1 · 2 days ago

      I guess credit is due, but with a basic understanding of the technology, I’d be surprised if it did stop. Seems like a very staged and timed thing he’s done just as a fuck-you to Elon.

      Which Elon deserves, he’s a fucking flog. I just didn’t think the video was that impressive.

      • Carighan Maconar@lemmy.world · ↑14 · 2 days ago

        Well, the setup is “some rich fucker was idiot enough to dictate that his company should not install LIDAR, no matter how strictly superior it’d be to do so.”

        So I dunno. We could ask Melon Husk why he set that up, but I doubt he’ll reply. All Mark Rober did was observe the effect of Edolf Muskler’s setup. You’re right, of course, in so far as there’s not much to the video then. I faintly remember a case from years ago where a Tesla drove into a sky-blue truck because, coming up on a crest, it expected to see only hill, which matched the color of said truck.

      • JasonDJ@lemmy.zip · ↑22 · 2 days ago

        The Tesla passed one of the tests from that video, and I was honestly surprised it could. Can’t remember if it was heavy fog or bright oncoming lights.

        Either way, yeah, LIDAR is far more capable and beat the Tesla in every test. Incredibly foolish to go camera-only, but hey, that’s Elon.

        • perestroika@lemm.ee · ↑19 · 2 days ago

          It was the lights test.

          In the fog test, it plowed through the mannequin kid (and in real life, they’ve been observed plowing through deer).

      • Natanael@infosec.pub · ↑6 · 2 days ago

        It’s certainly possible, although harder. Parallax effects should be visible on a real road, which wouldn’t be the case in a painting of a road.

        But the system doesn’t track how the image changes as the car moves, so Teslas can’t catch that.
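
        A minimal sketch of that parallax idea, assuming OpenCV and two consecutive frames from a forward-facing camera; the feature counts and thresholds are invented for illustration. If nearly all tracked feature motion between frames fits a single plane homography, the “scene” may be a flat painting rather than a road with depth:

        ```python
        # Sketch: flag a possible "painted wall" by testing whether tracked
        # features move like a single flat plane between two frames.
        import cv2
        import numpy as np

        def looks_like_flat_painting(prev_gray, curr_gray, reproj_px=1.0):
            # Pick corner features in the first frame and track them into the second.
            pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                               qualityLevel=0.01, minDistance=7)
            if pts_prev is None:
                return False
            pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                           pts_prev, None)
            ok = status.ravel() == 1
            p0 = pts_prev[ok].reshape(-1, 2)
            p1 = pts_curr[ok].reshape(-1, 2)
            if len(p0) < 20:
                return False
            # One homography explains all the motion only if the scene is a
            # single plane (e.g. a painted wall). A real road scene has depth,
            # so many points fall outside the RANSAC inlier set.
            H, inliers = cv2.findHomography(p0, p1, cv2.RANSAC, reproj_px)
            if H is None:
                return False
            return inliers.ravel().mean() > 0.95  # nearly planar -> suspicious
        ```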

    • Gladaed@feddit.org · ↑2 ↓1 · 1 day ago

      Also, they explicitly endorsed still buying Teslas, saying they’re good enough.

  • Jesus@lemmy.world · ↑130 ↓1 · 2 days ago

    People are arguing about autopilot being disabled during the drive, but even if it was, the emergency braking system should have tried to do something.

    • Cyrus Draegur@lemm.ee · ↑203 ↓4 · 2 days ago

      Autopilot literally switched ITSELF off less than half a second before the moment of impact. It didn’t try to stop the car, it just shut itself off so it couldn’t be blamed.

      Imagine if someone whipped throwing knives at your back and then tried to argue “but your honor I was not holding any knives at the time of the stabbing”

      Fuck Tesla, fuck Elon, fuck every simp who shits excuses out their mouths for him

      (I mean, not YOU, you aren’t doing any of those things; I’m just saying, those people. In general.)

      • FauxLiving@lemmy.world · ↑68 ↓1 · 2 days ago

        It’s like a pilot bailing out of a plane and then claiming he was not responsible for the crash because he was in Vegas at the time the plane crashed.

      • pyre@lemmy.world · ↑13 · 2 days ago

        It’s always been doing this. It’s so they can claim AP wasn’t active during the crash and evade liability.

      • Ledericas@lemm.ee · ↑6 · 2 days ago

        I think Elon made sure it switches off so he doesn’t get hit with liability for the autopilot being at fault.

      • Jesus@lemmy.world · ↑20 ↓10 · 2 days ago

        AP is supposed to disable itself if a fault or abnormality is detected. Pretty much all advanced cruise control systems do this.

        I don’t think it’s fair to say the car was hiding evidence of AP being used unless it was intentionally logging the data in a shady way. We’d need to see the car’s logs, and there are some roundabout ways for a consumer to pull those. That would probably be an interesting test for someone on YouTube to run.

        • mosiacmango@lemm.ee · ↑41 ↓1 · 2 days ago

          These systems disable right before a crash because the US national traffic safety regulator (NHTSA) requires manufacturers to report whether these systems were engaged during an accident.

          It is not for safety or because of a malfunction; it’s for marketing. Car companies don’t want the features they sell for $3–8k coming up all the time in crash statistics.

          Tesla is the biggest offender here, likely due to vehicles sold, but also due to their camera-only system and their aggressively false “Full Self-Driving” and “Autopilot” marketing that far over-promises.

          • Jesus@lemmy.world · ↑5 ↓1 · 2 days ago

            Just saying I’d like to see some more data. I get that Musk is not someone who should be trusted, especially around complying with regulators.

            That said, I could see that system being disengaged by some intended safety triggers.

            • Redjard@lemmy.dbzer0.com · ↑8 · 2 days ago

              At the very least, the system should initiate emergency braking when it disengages like that and there is no conflicting human input.
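
              A sketch of that fail-safe, assuming the controller can read steering torque and pedal state; every name and threshold here is invented for illustration:

              ```python
              # Sketch: on a self-initiated disengage with no human input,
              # command emergency braking instead of silently handing off.
              def on_autopilot_disengage(self_initiated: bool,
                                         steering_torque_nm: float,
                                         brake_pedal_pct: float,
                                         accel_pedal_pct: float) -> str:
                  human_input = (abs(steering_torque_nm) > 3.0
                                 or brake_pedal_pct > 0
                                 or accel_pedal_pct > 0)
                  if self_initiated and not human_input:
                      return "EMERGENCY_BRAKE"   # fail safe, not a silent hand-off
                  return "HAND_OFF_TO_DRIVER"
              ```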

              • Jesus@lemmy.world · ↑4 · 2 days ago

                100% agree. My stupid Volvo does that, and it doesn’t have lidar or a million cameras around it.

    • Darkassassin07@lemmy.ca · ↑100 · 2 days ago

      Mark did an interview with Philip DeFranco and posted raw footage showing/explaining that Autopilot turned itself off instead of hitting the brakes.

      They also did two takes, and it did the same thing both times. The first time, they just used a poster instead of a full foam wall; they decided to add the foam for a better visual once they realized the car would just happily plow through it.

      Finally, there’s some argument of Autopilot vs FSD, but both rely on the same cameras and should have at least tried to brake. The LIDAR car braked, and it was just using emergency braking, no self-driving at all.

    • Carighan Maconar@lemmy.world · ↑7 · 2 days ago

      Why would it? It sees a road in front of itself; the whole car is built + programmed to go by what it sees, as an image.

      The car is doing exactly what it was built to do. It just so happens that “safety of road traffic” is not among the things it was built for, and explicitly so.

      • LeninOnAPrayer@lemm.ee · ↑2 · 18 hours ago

        I think it’s a good example of how vastly overhyped a lot of AI stuff is, though. Any person could tell that was a wall from hundreds of feet away. You can even tell in the video, and it would be more obvious in real life with actual depth perception.

        The entire problem with AI is corner cases. This is just taking a corner case to an absurd level, but it’s not much different from the real-world case of a Tesla getting confused on a highway because a billboard ad has a stop sign on it.

    • takeda@lemm.ee · ↑51 · 2 days ago

      It shows they didn’t even watch the video. The only reason Mark used Autopilot was that without it the car failed the simplest test.

      The test had a kid mannequin in the middle of the street, and the Tesla didn’t even stop in time.

      https://youtu.be/IQJL3htsDyQ?t=9m38s

      • pivot_root@lemmy.world · ↑22 · 2 days ago

        It shows they didn’t even watch the video.

        Of course they didn’t. Musk fanboys are an echo chamber of morons worshipping a Nazi oligarch. They’re quick to react, and they dismiss evidence and facts if they don’t suit their narrative.

    • boonhet@lemm.ee · ↑13 · 2 days ago

      I watched some Tesla-sympathetic YouTuber for balance, and here are the key points brought up:

      1. He had a death grip on the wheel (because, y’know, he knew he was going to crash). Exerting enough force on the steering wheel over time disables Autopilot, because the system assumes you want to manually override what it’s doing.

      2. FSD is apparently much more capable, but this Tesla only had the common Autopilot turned on, despite having FSD available (Mark apparently claimed he didn’t know he could turn it on without adding a destination).

      3. Mark might have some sort of sponsorship deal with the LIDAR company featured in the video, which is why LIDAR was shown in a much better light (e.g. it was shown stopping for a dummy behind the water spray, but in reality a LIDAR-based system would just brake for the water spray itself).

      Now all of those might be true, but you’re also correct that the emergency braking system should be operational even when AP is disabled, unless the system malfunctioned (just having a dirty camera is enough). I know my Subaru throws out the adaptive cruise ALL the time. Stupid camera-based system. You’d think it’d be better off with the cameras at the top of the windshield, compared to most cars’ front-grille-mounted radars, but nah, it just keeps turning off.

      • Honytawk@lemmy.zip · ↑18 · 2 days ago

        They are free to peer review the test and do it with all the stuff enabled.

        That is how science works.

        But I doubt they will, since this is an inherent problem with using camera vision only, not with the car’s software. And they most likely know it.

        • KubeRoot@discuss.tchncs.de · ↑2 ↓1 · 1 day ago

          I will point out, I don’t think “peer review” means repeating the test; it means more generally pointing out issues with the science, right? By that definition, it sounds like that’s what they’re doing. That doesn’t make the criticisms inherently valid, but dismissing them with “they’re free to do their own tests” because “that is how science works” seems dishonest.

          • madnotangry@lemmy.world · ↑1 · 5 hours ago

            Peer review usually means repeating the test and comparing results with the original paper. If reviewers can’t get the same results, it means the first study was faulty or wasn’t described accurately.

        • boonhet@lemm.ee · ↑1 ↓9 · 2 days ago

          Humans also operate on “camera vision” only, in that we see visible light and that’s it. Adding lidar to the system should improve performance beyond human capability, but camera vision with good enough software (and this is way easier said than done) ought to be able to match human capability. Whether Tesla’s is good enough in FSD mode I have no idea, because I have no intention of ever buying one, and testing this in a rental is uh… risky, given that they tend to have onboard cameras.

          Of course, if Tesla’s “FSD”-branded driver-assist suite were actually good enough to beat this test, I reckon Tesla would be quick to prove it and save their own reputation. It’s not that hard to reproduce.

          • LeninOnAPrayer@lemm.ee · ↑1 · 6 hours ago

            https://www.adafruit.com/product/4058?gQT=1

            These are extremely, EXTREMELY reliable at detecting physical obstructions. There is no reason but stupidity or cheapness not to build redundancy into a safety system. This isn’t about implementing “good enough” software; it’s about a design choice forced on Tesla engineers by a complete idiot who doubles down on his stupidity when faced with criticism from actually intelligent people.
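
            For what it’s worth, a minimal sketch of reading a trigger/echo ultrasonic rangefinder of that general type as a redundant last-line check (assuming an HC-SR04-style part on a Raspberry Pi; the pins and the 2 m braking threshold are invented for the example):

            ```python
            # Sketch: time an ultrasonic echo pulse and convert it to distance.
            import time
            import RPi.GPIO as GPIO

            TRIG, ECHO = 23, 24                      # example wiring
            GPIO.setmode(GPIO.BCM)
            GPIO.setup(TRIG, GPIO.OUT)
            GPIO.setup(ECHO, GPIO.IN)

            def distance_m(timeout_s=0.05):
                GPIO.output(TRIG, True)              # ~10 microsecond trigger pulse
                time.sleep(10e-6)
                GPIO.output(TRIG, False)
                t0 = time.time()
                start = end = t0
                while GPIO.input(ECHO) == 0:         # wait for echo to begin
                    start = time.time()
                    if start - t0 > timeout_s:
                        return float("inf")          # nothing in range
                while GPIO.input(ECHO) == 1:         # wait for echo to end
                    end = time.time()
                    if end - start > timeout_s:
                        return float("inf")
                return (end - start) * 343.0 / 2     # speed of sound, out and back

            if distance_m() < 2.0:                   # obstruction close ahead:
                print("BRAKE")                       # brake, whatever the cameras think
            ```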

          • Mandrilleren@lemmy.world · ↑8 · 2 days ago

            Not just good enough software: also good enough cameras and good enough processing power, none of which currently match humans. So this is not a valid argument.

            The camera-only system is just worse at everything.

      • greenhorn@lemm.ee · ↑7 · 2 days ago

        Even without the fanboys’ justifications, what did this test prove that the others didn’t? It didn’t mimic a real-world scenario like the tests where the Tesla demolished the kid. I’ve driven through fog and lights and heavy rain, but I have yet to encounter an unexpected Wile E. Coyote wall in the road.

        • helpImTrappedOnline@lemmy.world · ↑3 · 1 day ago

          The absurd test was mostly for the spectacle/views. Sometimes science is doing wacky things because we’re curious to find the limits.

          Someone else mentioned a blue truck at the crest of a hill that was invisible to the system, resulting in a crash. That’s probably the closest to a Wile E. scenario you’re going to get.

      • Manalith@midwest.social · ↑2 · 2 days ago

        If nothing else, Mark did say the LiDAR company supplied the car, but that’s it: they had no say in the test and didn’t give him any money. Apparently they did put the video up on their site for a bit, but took it down, either because it looked bad given the backlash or because Mark told them to take it down, as it went against their agreement.

        Of course he could have lied about the sponsorship, but he said he’s fine with a lawsuit, so that would be a bold strategy.

  • DaddleDew@lemmy.world · ↑105 · 2 days ago

    I love that they had pre-cut the styrofoam wall in a cartoony hole shape because they knew it was going to happen.

  • taiyang@lemmy.world · ↑35 ↓1 · 2 days ago

    You know, with how distracted your average Tesla driver is, I’m pretty sure this would trick them even while they’re “driving” with AP on.

    • FauxLiving@lemmy.world · ↑81 · 2 days ago

      Teslas famously don’t use lidar because Musk declared that cameras were good enough. Reality disagrees, but reality owns no shares of Tesla.

      • BossDj@lemm.ee · ↑9 · 2 days ago

        The first ten minutes is him “sneaking” a small lidar unit into Disneyland and using it to make 3D models of the ride paths. That’s pretty fun.

      • Ledericas@lemm.ee · ↑4 · 2 days ago

        Elon Musk didn’t want lidar because it cost too much. Any Tesla before 2018 had it, but an update was pushed that bricked it on all of them.

    • TheCoralReefsAreDying69@lemmy.world · ↑5 · 2 days ago

      A decent camera-only vision system should still be able to detect the wall. I was actually shocked that Tesla failed this test so egregiously.

      If you use two side-by-side cameras, you can determine the distance to a feature by calculating the offset in the feature’s position between the two camera images. I had always assumed this was how Tesla planned to achieve camera-only FSD, but that would make too much sense.

      https://www.intelrealsense.com/stereo-depth-vision-basics/
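
      As a rough sketch of that idea (assuming OpenCV and a calibrated stereo pair; the focal length and baseline below are placeholder values), depth falls straight out of the disparity:

      ```python
      # Sketch: depth from stereo disparity, Z = f * B / d.
      import cv2
      import numpy as np

      FOCAL_PX = 800.0     # focal length in pixels, from calibration
      BASELINE_M = 0.12    # spacing between the two cameras, in metres

      stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                     blockSize=5)

      def depth_map(left_gray, right_gray):
          # compute() returns fixed-point disparity scaled by 16
          disp = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
          disp[disp <= 0] = np.nan             # no match -> depth unknown
          return FOCAL_PX * BASELINE_M / disp  # metres at each pixel
      ```

      A painted wall would then read as one constant depth across the whole region it covers, instead of receding the way a real road does.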

      Even if they wanted to avoid any redundant hardware and go with only one camera in each direction, there is still a chance they could have avoided this kind of issue by using structure from motion, but that’s much harder to do if the objects in the scene could be moving.

      https://en.m.wikipedia.org/wiki/Structure_from_motion
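
      And a sketch of the structure-from-motion route with a single moving camera, again assuming OpenCV and a known intrinsic matrix K (the matched point arrays are taken as given; monocular SfM only recovers geometry up to scale, and, as noted above, it degrades when objects in the scene are themselves moving):

      ```python
      # Sketch: two-view structure from motion with one moving camera.
      import cv2
      import numpy as np

      def triangulate_two_views(pts1, pts2, K):
          # pts1, pts2: Nx2 float32 matched pixel coordinates in frames 1 and 2.
          E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
          _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
          P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # first camera at origin
          P2 = K @ np.hstack([R, t])                         # second camera pose
          pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
          return (pts4d[:3] / pts4d[3]).T                    # Nx3 points, up to scale
      ```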

  • Hawke@lemmy.world · ↑14 · 2 days ago

    I used to bullseye womp rats in my T-16 back home, they’re not much bigger than 2 meters.

  • LordKitsuna@lemmy.world · ↑8 ↓24 · 2 days ago

    Not defending Tesla or anything, but let’s not pretend the majority of people on the road looking at their phone wouldn’t have done the exact same thing.

    • Red_October@lemmy.world · ↑12 ↓2 · 1 day ago

      Someone should make looking at your phone while driving illegal. Maybe even slap it with a catchy name like “distracted driving” and make the issue a whole big deal.

    • Polderviking@feddit.nl · ↑8 · 1 day ago

      You’re not wrong with the comparison, but the whole point of the computer having control over the vehicle is that this doesn’t happen, despite the driver’s abject negligence.

    • AnyOldName3@lemmy.world · ↑31 ↓2 · 2 days ago

      Humans with two working eyes can tell the difference between a flat painted surface and a 3D world. Humans with only one eye might crash, though.

      • SpaceCowboy@lemmy.ca · ↑7 · 2 days ago

        I doubt someone without depth perception would crash either. They’d notice the straps on the side and things not being the exact right colour shade. They might think it was a big piece of glass set up on the road, but that wouldn’t be something you’d just plow through.

      • daniskarma@lemmy.dbzer0.com · ↑4 ↓6 · 2 days ago

        Results may vary depending on the alcohol currently in the human’s blood.

        One of the reasons I advocate for self-driving cars is that they cannot get drunk and drive.

        Edit: ah yes, Lemmy, downvoted for pointing out that people drink and drive. Classic Lemmy.

        • Comtief@lemm.ee · ↑5 · 1 day ago

          This is on par with the comment that was something along the lines of “yes, Tesla is bad and crashes, but humans can be on their phone and crash too, so why criticize self-driving cars?” Nice whataboutism.

          • daniskarma@lemmy.dbzer0.com · ↑1 ↓1 · 1 day ago

            This is a response to a comment that said a human being would drive better, pointing out that it has been amply demonstrated that that is not the truth.

            But, once again, we are on Lemmy. So new technology = bad.

        • piecat@lemmy.world · ↑8 ↓1 · 2 days ago

          Great, a computerized car gets tricked by things that inebriated humans might also get tricked by.

          That’s quite the bar being set.

          • daniskarma@lemmy.dbzer0.com · ↑1 ↓6 · 2 days ago

            Drunk humans get tricked by things that computerized cars are not, though.

            But in the current Butlerian Jihad mood, anything technologically advanced is to be criticized and destroyed, even if it saves lives.

            • Comtief@lemm.ee · ↑5 · 1 day ago

              technologically advanced

              Err… sir, this is about Teslas, not about technologically advanced self-driving cars.

              • daniskarma@lemmy.dbzer0.com · ↑1 ↓1 · 1 day ago

                Most people have exactly the same opinion about any self-driving car, even those using lidar.

                Lemmy has a strong anti-technology bias.