Tech CEOs want us to believe that generative AI will benefit humanity. They are kidding themselves

  • Turkey_Titty_city@kbin.social · 1 year ago

    I mean, AI is already generating lots of bullshit ‘reports’: stuff that reports the ‘news’ with zero skill. It’s glorified copy-pasting, really.

    If you think about how much language is rote, in law and similar fields, it makes a lot of sense to use AI to auto-generate it. But it’s not intelligence. It’s just a linguistic assembly line, and just like in a factory, it will require human review for quality control.

    • 🐝bownage [they/he]@beehaw.org · 1 year ago

      The thing is - and this is what’s also annoying me about the article - AI experts and computational linguists know this. It’s the laypeople who end up using (or promoting) these tools now that they’re public who don’t know what they’re talking about and project intelligence onto AI that isn’t there. The real hallucination problem isn’t with deep learning, it’s with the users.

      • mrnotoriousman@kbin.social · 1 year ago

        Spot on. I work on AI and just tell people, “Don’t worry, we’re not anywhere close to Terminator or Skynet or anything remotely like that yet.” I don’t know anyone I work with who wouldn’t roll their eyes at most of these “articles” you’re talking about. It’s frustrating reading some of that crap lol.

      • exohuman@kbin.social (OP) · 1 year ago

        The article really isn’t about the hallucinations, though. It’s about the impact of AI, which is covered in the second half of the article.

    • fiasco@possumpat.io · 1 year ago

      This is the curation effect: generate lots of chaff, and have humans search for the wheat. Thing is, someone has already gotten in deep shit for trying to use this kind of deep learning tool for legal filings.