• mofongo@lemm.ee · +157/−3 · 1 year ago

    That’s actually a really good dilemma if you think about it. If everyone doubles it, you basically don’t kill anyone. But there’s always the risk that someone down the line is a psycho who likes killing, and then you will have caused even more deaths. And if these choices continue endlessly, you will eventually reach someone like that. So killing immediately should be the right thing to do.

    • CookieJarObserver@sh.itjust.works · +30/−1 · 1 year ago

      Some day it reaches a person who thinks…

      Well, 4 billion fewer people is better than someone being able to wipe out humanity…

      (it would also solve many problems lol)

      (and that point would come after 32 people had the choice…)

      • ArbitraryValue@sh.itjust.works · +3 · edited · 1 year ago

        Meanwhile Thanos is on the third switch and very frustrated. (He would double it and pass it to the next person - there’s no point in killing four people when there’s a chance that the second-to-last guy might kill half of humanity.)

    • The Snark Urge@lemmy.world · +24/−3 · 1 year ago

      This is really the only answer. The only thing that makes it “hard” is having to face the brutality of the moral calculus.

      • LazaroFilm@lemmy.world · +9 · edited · 1 year ago

        Now, what if you’re not the first person in the chain? What if you’re the second one, or the nth one? What now? Would you kill two, or n, knowing that the person before you spared them?

        • Neato@kbin.social · +20 · 1 year ago

          The thing to do is kill now even if it’s thousands. Because it’s only going to get worse.

          The best time to kill was at the first trolley. The second best time to kill is now.

          • apollo440@lemmy.world · +6 · 1 year ago

            Yes, but it also kinda depends on what happens at and after junction 34, from which point on more than the entire population of earth is at stake.

            If anything, this shows how ludicrously fast exponentials grow. At the start of the line it seems like there will be so many decisions made down the line that there must be a psycho in there somewhere, right? But (assuming the game just ends after junction 34) you’re actually just one of 34 people, and the chance of getting a psycho is virtually 0.

            Very interesting one!
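A quick sketch backs this up. Assuming a world population of roughly 8 billion (my assumption; the thread uses similar figures) and that junction n puts 2^(n−1) people on the track, the game does end around junction 34:

```python
# Junction n puts 2**(n-1) people on the track: 1, 2, 4, 8, ...
# Find the first junction that holds more people than exist on Earth.
POPULATION = 8_000_000_000  # rough world population (assumption)

n = 1
while 2 ** (n - 1) < POPULATION:
    n += 1
print(n)  # -> 34
```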

            • The Snark Urge@lemmy.world · +1 · 1 year ago

              It’s not that interesting. If you rephrase the question as a choice between a good option and a less good one, it’s still barely even a choice.

              “Would you rather have only one (or, say, trillions) die now, or would you like to allow *at a minimum* twice that many people to die the second we talk to a sadist?”

              If you can’t choose the smaller number, all it means is that you lack moral strength - or the test proctor has put someone you know on the tracks, which is cheating. A highly principled person might struggle if choosing between their daughter and one other person. If it’s my kid versus a billion? That’s not a choice, that’s just needless torture. Any good person would sacrifice their kid to save a billion lives. I take that as an axiom, because anything else is patently insane.

              • apollo440@lemmy.world · +4 · edited · 1 year ago

                Kill fewer people now is obviously the right answer, and not very interesting.

                What is interesting is that the game breaks already at junction 34, which is unexpectedly low.

                So a more interesting dilemma would have been “would you kill n people now, or double it and pass it on, knowing the next person faces the same dilemma, but once all of humanity is at stake and the lever is not pulled, the game ends?”. That would involve first figuring out that the game actually only involves 34 decisions, and then the dilemma becomes “do I trust the next 33−n people not to be psychos, or do I limit the damage now?”. Even more interestingly, “limiting the damage now” makes you the “psycho” in that sense…

                • The Snark Urge@lemmy.world · +1 · edited · 1 year ago

                  You’re right, the fact that the game never ends is what made the choice too easy.

                  EDITED

                  For this study you want sociopathy, not psychopathy. I can report from my wasted psych degree that sociopathy occurs in 1-2% of the population.

                  Binomial probability tells us that if you repeat a 1% chance test 32 times, you have about a 72% chance of never seeing it (0.99^32 ≈ 0.72).

                  Don’t pull the lever. Sorry for the ninja edit, I misread something.
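The arithmetic is easy to check. Taking the comment's 1–2% base rate as the assumption, the chance that none of 32 independent people is a sociopath is (1 − p)^32, which comes to about 72% at 1% and about 52% at 2%:

```python
# P(no sociopath among 32 independent people) = (1 - p) ** 32,
# for the 1-2% base rates claimed in the comment above.
for p in (0.01, 0.02):
    none = (1 - p) ** 32
    print(f"base rate {p:.0%}: chance of no sociopath = {none:.1%}")
```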

          • Rodeo@lemmy.ca · +1 · 1 year ago

            How could you know someone else is going to do it though? And how is their decision your responsibility?

            If you kill someone you are a killer. It’s that simple.

    • atlasraven31@lemm.ee · +4 · edited · 1 year ago

      Eventually there might also be a track with no people on it, so postponing the dilemma would become much better than at least one death. But there is no way of knowing what the future dilemmas might be.

    • Sotuanduso@lemm.ee · +4/−1 · 1 year ago

      But you’ll always risk that there’s some psycho who likes killing and then you will have killed more.

      I disagree. The blood is not on your hands.

      Suppose you see someone walking towards a bank with a gun. You have an opportunity to steal their gun. If you don’t, and they go on to kill 5 people in an armed robbery, is the blood on your hands?

      Suppose you see a hunter in the woods with a gun. You have an opportunity to kill them. If you don’t, and they go fire on a city street and kill 5 people, is the blood on your hands?

      Suppose you see a juvenile delinquent on the path to being a serial killer. You have an opportunity to kill an old lady in front of them to scare them straight. If you don’t, and they go on to kill 5 people, is the blood on your hands?

      Suppose you see a newborn baby. You have an opportunity to kill them. If you don’t, and they grow up to become a terrorist and kill 5 people, is the blood on your hands?

        • Sotuanduso@lemm.ee · +1/−1 · 1 year ago

          If I have to kill a person to stop a chance that a random person will be evil or misguided enough to choose to kill millions, it’s not worth it.

          Murder is wrong, and that’s an absolute. And then someone’s gonna come in with the “what if you have to kill someone to stop nuclear war from destroying the earth, and you can’t just get the authorities for some reason?”

    • CarbonIceDragon@pawb.social · +2 · 1 year ago

      Assuming, of course, that it goes on forever. That admittedly seems like what one is intended to think, but the graphic doesn’t actually show or state it. Realistically, if this scenario were actually set up, it couldn’t go on forever, because eventually some limit would make the problem physically impossible (running out of people to tie to the tracks, running out of space for them, having so much mass in one place that it undergoes gravitational collapse, the finite size of the observable universe making an infinite dilemma impossible to fit, etc.)

    • HummingBee@sh.itjust.works · +2 · 1 year ago

      That leads to another interesting split path. Maybe it’s best to just kill the one right away. Assuming this goes on forever, it’s basically inevitable that someone somehow will end up killing an obscene number of people eventually. But maybe it’d be like nukes, and eventually reach a point where flipping the lever is just mutually assured destruction, and no one would ever actually do that.

    • dan1101@lemmy.world · +1 · 1 year ago

      Yeah, so it would be tough to decide if you wanted to be at an early, middle, or late junction. It all depends on how the people at the switches think.

    • foggy@lemmy.world · +1/−1 · 1 year ago

      It’s a bad dilemma because if we repeat the process we only end up with one deranged lunatic.

  • beaubbe@lemmy.world · +59 · 1 year ago

    You gotta double it until it overflows to negatives, then you end up reviving billions of people!
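Playing along with the joke: if the trolley’s death counter were a signed 32-bit integer (pure assumption, for the bit), doubling from 1 would wrap to a negative count on the 31st doubling:

```python
def wrap32(x):
    """Wrap x into signed 32-bit two's-complement range."""
    x &= 0xFFFFFFFF
    return x - (1 << 32) if x >= (1 << 31) else x

# Double the victim count until it "overflows to negatives".
victims, doublings = 1, 0
while victims > 0:
    victims = wrap32(victims * 2)
    doublings += 1
print(doublings, victims)  # -> 31 -2147483648
```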

    • Sotuanduso@lemm.ee · +11/−1 · 1 year ago

      That implies that if nobody tries to stop climate change, it’ll never destroy the world.

  • evdo@lemmy.blahaj.zone · +34 · 1 year ago

    If I must kill 1 person or cause even more death, I suppose I’d kill the person responsible for this scenario.

    • Rodeo@lemmy.ca · +2 · 1 year ago

      They simply have to choose not to kill anyone.

      Nobody in this situation ever has to die. It is not some difficult choice that you are burdening the next person with. The choice is obvious.

  • Afflictedlife@lemmy.ml · +19/−1 · 1 year ago

    The loop continues until the entire human population is tied to the track and there’s nobody left to pass the switch to. Kill the scapegoat on round one and be done.

  • ApfelstrudelWAKASAGI@feddit.de · +17/−1 · edited · 1 year ago

    You would need a crazy low probability of a lunatic or a mass murderer being down the line to justify not killing one person.

    Edit: Sum(2^n (1−p)^(n−1) p) ≈ Sum(2^n p) for small p. So you’d need a p = 1/(2×2^32 − 2) ≈ 1/(8.6 billion) chance of catching a psycho for the expected values to be equal. I.e., there is at most a single person on Earth who would decide to kill everyone.
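The break-even point is easy to compute directly. If each downstream person independently kills with probability p, the expected deaths from doubling are roughly p times the sum of 2^n over the remaining junctions (the small-p approximation above); setting that equal to the one certain death now gives:

```python
# Expected deaths from doubling ~ p * sum(2**n), summing n = 1..32
# (matching the comment's cutoff, since 2**33 exceeds the world population).
total_at_stake = sum(2 ** n for n in range(1, 33))  # = 2 * 2**32 - 2
p_break_even = 1 / total_at_stake  # p where doubling matches 1 certain death
print(total_at_stake, p_break_even)  # ~8.6 billion, ~1.2e-10
```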

    • m0darn@lemmy.ca · +7/−1 · 1 year ago

      Well, what about the fact that after 34 people the entire population is tied to the tracks? What are the chances that one person out of 35 wants to destroy humanity?

      Also, tying the entire human population to the tracks is going to cause some major logistical problems; how are you going to feed them all?

      • ApfelstrudelWAKASAGI@feddit.de · +5 · 1 year ago

        I just calculated the sum from n=0 to 32 (because 2^33>current global population). And that calculation implies that the chance of catching someone willing to kill all of humanity would have to be lower than 1/8 billion for the expected value of doubling it to be larger than just killing one person.

        • m0darn@lemmy.ca · +1 · 1 year ago

          Yeah, I think I was in a stupor when I commented. I don’t think I even tried to understand your comment. My apologies. But now that I am trying, I am struggling to understand the notation.

      • SeaJ@lemm.ee · +5 · 1 year ago

        Oh come on. A trolley is not going to have the momentum to kill that many people nor would the machinery make it through. The gears and whatnot would be totally gummed up after like 20 or so people.

    • ChrisGrantsBrownlow@aussie.zone · +6 · 1 year ago

      You don’t even need a lunatic or mass murderer. As you say, the logical choice is to kill one person. For the next person, the logical choice is to kill two people, and so on.

      • ApfelstrudelWAKASAGI@feddit.de · +17 · 1 year ago

        It does create the funny paradox where, up to a certain point, a rational utilitarian would choose to kill and a rational mass murderer trying to maximise deaths would choose to double it.

            • interdimensionalmeme@lemmy.ml · +1/−1 · edited · 1 year ago

              Doubling forever minimizes human deaths.

              Unless someone decides to hit kill. In that case, it’s them doing it. I’m rejecting the argument that pre-empting imaginary future mass murderers justifies killing one person today.

              • ApfelstrudelWAKASAGI@feddit.de · +3 · 1 year ago

                Idk which moral system you operate under, but I’m concerned with minimising human suffering. That implies hitting kill, because the chance of a mass murderer down the line is too high not to. You also don’t just follow traffic laws to a T; you exercise caution, because you don’t really care whose fault it ends up being, you want to avoid bad outcomes (in this case the extinction of humankind).

                  • interdimensionalmeme@lemmy.ml · +1 · 1 year ago

                    My moral system somehow does not choose to kill people through action against an imagined threat, and is therefore objectively superior, as it is not susceptible to hostile memetic manipulation (Moloch, Pascal’s wager, Pascal’s mugging, basilisks, social hysteria, etc.) and is capable of escaping false choices and other contrived scenarios, breaking the premise and the rules of the game as needed to obtain the desired outcome.

  • Grenmark@lemmy.ml · +11 · 1 year ago

    I was gonna just do the one but they do say it’s best to pay it forward when you can.