• @zazo@lemmy.world
    6 months ago

    I’m sorry to break this to you - but you probably weren’t in the training dataset enough for the model to learn of your online presence. Yes, LLMs will currently hallucinate when they don’t have enough data points (until they learn their own limitations) - but that’s not a fundamentally unsolvable problem (not even top 10, I’d say).

    There already are models that consider the limits of their knowledge and just apologize if they can’t answer instead of making shit up (e.g. Claude).

    • Flying Squid
      6 months ago

      Considering these LLMs are being integrated with search engines in a way that might eventually replace them, don’t you think their training should include knowing who someone is when Googling them turns up a bunch of hits?