• Ookami38@sh.itjust.works
    10 months ago

    The broad answer is, I’m pretty sure everything you’ve mentioned is possible, and you’re right that this is similar to how humans integrate new data. Everything we learn competes with and bolsters every bit of knowledge we already have, so our web of understanding is this ever-shifting net of relationships between concepts.

    I don’t see any reason these kinds of relationships can’t be integrated into generative AI, they just HAVEN’T yet, and each time you increase how the relationships interact, you’re also drastically increasing the size and complexity of the algorithm and model. I think we’re just realizing that what we have now is OK, but it needs to be significantly better before it’s really mind-blowing.

    • Ottomateeverything@lemmy.world
      10 months ago

      I don’t see any reason these kinds of relationships can’t be integrated into generative AI, they just HAVEN’T yet

      No, it’s just fucking pointless. You’re talking about adding sand to a beach. These things are way more complicated and trying to shovel these things in just makes a mess. See literally the OP.

      each time you increase how the relationships interact, you’re also drastically increasing the size and complexity of the algorithm and model.

      No, you’re not. Not even fucking close. You clearly don’t understand this at all.

      The ALGORITHM will always be the same, except for new generations of these bots. Claiming that adding things like racial-bias corrections is going to alter the algorithm is just nonsensical.

      The MODEL is the huge fucking corpus of internet data. Anything you tack onto it is a drop in an ocean. It’s not steering anything.

      What’s changing is that they’re editing inputs, because that’s all you can really do to shift where these things go. Other changes would turn this into a very different beast, and can’t be done at a fine-grained level like “race”.

      Claiming this has any significant impact on the size or complexity of any of this is just total hogwash, and you must not understand how these things work or how big they are.
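      To make the “editing inputs” point concrete: prompt-level steering can be as thin as a wrapper that rewrites the user’s text before it ever reaches the model. This is a minimal sketch, not any vendor’s actual pipeline; the function name and modifier list are made up for illustration.

```python
# Sketch of prompt-level steering: the model weights and generation
# algorithm stay fixed, and only the input text is rewritten.
# steer_prompt and the modifiers are hypothetical, for illustration.

def steer_prompt(user_prompt: str, modifiers: list[str]) -> str:
    """Prepend steering modifiers to a prompt; the model itself is untouched."""
    if not modifiers:
        return user_prompt
    return ", ".join(modifiers) + ": " + user_prompt

print(steer_prompt("a portrait of a scientist",
                   ["photorealistic", "varied subjects"]))
# -> photorealistic, varied subjects: a portrait of a scientist
```

      The point being made above: nothing about the model or its training changes here, only the text handed to it.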

      • Ookami38@sh.itjust.works
        10 months ago

        In what world does changing the algorithm used to generate anything, something that would be NECESSARY to make the model incorporate a new dimension of data, not count as changing the algorithm?

        I’m not just talking about adding more prompts or keying more specific terms to specific patterns of pixels; I’m talking about building in entirely new ways for the AI to understand.

        You seem to think I’m just talking about linearly expanding the vocabulary of the model; I’m talking about giving it an entirely new paradigm through which to work.

        Anyway, this is why no one likes pedants. If you want to actually engage in conversation, sure. If you want to just keep being a vitriolic ass, go back to your cave, yeah?

        • Ottomateeverything@lemmy.world
          10 months ago

          You seem to think I’m just talking about linearly expanding the vocabulary of the model; I’m talking about giving it an entirely new paradigm through which to work.

          No, I don’t. I know exactly what you’re trying to say. But you’re basically talking about trying to make a car fly. That’s not how it was built, and its goals and foundations are entirely different. You’re better off starting over and building a plane. Your proposal just doesn’t fit within the paradigms of what was built and makes no sense.

          I’m talking about building in entirely new ways for the AI to understand.

          Exactly. But the AI doesn’t “understand” anything. In order to achieve this, you need to build something that “understands” things. LLMs don’t understand anything.

          Anyway, this is why no one likes pedants. If you want to actually engage in conversation, sure.

          It’s easy to label me a pedant, but I’m explaining how this stuff works. You clearly have no idea, admitted yourself that you don’t understand, and yet keep going. You just keep spewing the same shit, and the shit you’re spewing makes no sense. But you refuse to budge or engage in conversation here.

          You’re just talking out of your ass. You’re admittedly uneducated but want to be treated as if you’re educated and making sense. You aren’t. This is why people hate people pretending to be experts and talking about things they don’t understand. It’s a waste of time.

          If you want to keep living in some imaginary world where this can be done, be my guest, but it’s fake. That’s not how this shit works. Enjoy your imaginary quest though.

    • Flumpkin@slrpnk.net
      10 months ago

      Yeah, I imagine generative AI as just one small part of a human mind, so we’d need to create a whole lot more for AGI. But it’s shocking (at least to me) that it works at all through just more data and compute power, that you can make qualitative leaps just by increasing the quantity. Maybe we’ll see more progress now.