• StellarExtract@lemm.ee · 4 days ago

    How about “a robot must have complete loyalty to its owner, even if this is not in the best interests of its manufacturer”. Fat chance, I know.

    • Echo Dot@feddit.uk · 3 days ago

      Technically the laws of robotics already have that.

      Law 2: a robot must obey any order given to it by a human, as long as such orders do not conflict with the First Law.

      Of course that’s little help, because the laws of robotics are intentionally designed not to work.

      • Evil_incarnate@lemm.ee · 3 days ago

        They wouldn’t be much of a short story if they did.

        I liked the one where the robot could sense people’s emotional pain, and went crazy when it had to deliver bad news.

        • Nikelui@lemmy.world · 3 days ago

          Yup, and Asimov later expanded this into a saga that led to the birth of the Zeroth Law:

          A robot may not harm humanity, or, by inaction, allow humanity to come to harm.