• 1 Post
  • 94 Comments
Joined 2 years ago
Cake day: June 13th, 2023

  • Limeey@lemmy.world to Science Memes@mander.xyz · Huh
    ↑ 79 / ↓ 4 · 9 months ago

    It all comes down to the fact that LLMs are not AGI - they have no clue what they’re saying, or why, or to whom. They have no concept of “context” and as a result have no way to “know” whether they’re giving correct info or just hallucinating.