• WhyJiffie@sh.itjust.works · 17 days ago

    It’s also worth noting that you can ask tools like ChatGPT for their references.

    Last time I tried that, it made up links that didn’t work, and then admitted it couldn’t reference anything because it doesn’t have access to the internet.

    • Greg Clarke@lemmy.ca · 17 days ago

      That’s my point: if the model returns a hallucinated source, you can probably disregard its output, but if the model provides an accurate source, you can verify its output against it. Depending on the information you’re researching, this approach can be much quicker than using Google. The first pass of that check, whether the cited links even resolve, can be scripted; see the sketch below. Out of interest, have you experienced source hallucinations on ChatGPT recently (in the last few weeks)? I haven’t experienced a source hallucination in a long time.
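
      A minimal Python sketch of that first-pass check, assuming you paste the cited URLs from the model’s reply into the placeholder list (nothing here is tied to any particular chatbot or API):

      ```python
      # Check whether cited links resolve at all. CITED_URLS is a
      # placeholder; paste in the URLs the model actually returned.
      import requests

      CITED_URLS = [
          "https://example.com/paper",    # hypothetical citation
          "https://example.org/article",  # hypothetical citation
      ]

      for url in CITED_URLS:
          try:
              # HEAD keeps the check cheap; some servers reject HEAD,
              # so fall back to GET on a 405 before calling it dead.
              resp = requests.head(url, allow_redirects=True, timeout=10)
              if resp.status_code == 405:
                  resp = requests.get(url, allow_redirects=True, timeout=10)
              print(f"{url} -> {resp.status_code}")
          except requests.RequestException as exc:
              print(f"{url} -> unreachable ({exc})")
      ```

      Of course, a 200 only tells you the page exists; you still have to read it to see whether it actually supports the claim.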

      • 31337@sh.itjust.works · 16 days ago

        I use GPT-4o (premium) a lot, and yes, I still sometimes experience source hallucinations. It will also sometimes hallucinate incorrect things that aren’t in the source. I get better results when I tell it not to browse; the large context from processing web pages seems to hurt its “performance.” I would never trust gen AI for a recipe. I usually just use Kagi to search for recipes, with it set to promote results from recipe sites I like.