• Flipper@feddit.org
    12 hours ago

    Apparently it’s useful for extracting information from a text into a format you specify. A friend is using it to extract transactions from 500-year-old texts. However, to get rid of hallucinations the temperature needs to be 0. So the only way is to self-host.
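
    For reference, a minimal sketch of that kind of structured extraction against a self-hosted model with temperature set to 0. It assumes an OpenAI-compatible local endpoint (e.g. an Ollama or llama.cpp server at localhost); the model name, port, and output schema are placeholders, not a specific recommendation:

    ```python
    # Sketch: structured extraction from an old text via a self-hosted model.
    # Assumption: an OpenAI-compatible server at localhost:11434 (e.g. Ollama)
    # serving a model called "llama3" -- both are placeholders.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:11434/v1", api_key="unused")

    prompt = (
        "Extract every transaction from the text below as a JSON list of objects "
        "with the fields: date, payer, payee, amount, currency.\n\n"
        "Text:\n"
        "Anno 1524, the 12th of March, Hans Fugger paid the city of Augsburg 300 gulden."
    )

    response = client.chat.completions.create(
        model="llama3",
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # greedy decoding: deterministic, but not hallucination-free
    )

    print(response.choices[0].message.content)
    ```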

    • OhNoMoreLemmy@lemmy.ml
      11 hours ago

      Setting the temperature to 0 doesn’t get rid of hallucinations.

      It might slightly increase accuracy, but it’s still going to go wrong.
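
      Temperature only rescales the logits before sampling; at 0 it collapses to picking the single most likely token, so the output is deterministic but the model can still be confidently wrong. A toy sketch (hypothetical logits, just to show what the knob does):

      ```python
      import numpy as np

      def sample_token(logits, temperature):
          """Toy next-token sampler showing what temperature actually does."""
          if temperature == 0:
              # Greedy decoding: always the most likely token. Deterministic,
              # but "most likely" can still be factually wrong.
              return int(np.argmax(logits))
          scaled = np.array(logits) / temperature   # temperature rescales the logits
          probs = np.exp(scaled - scaled.max())     # softmax (numerically stabilized)
          probs /= probs.sum()
          return int(np.random.choice(len(probs), p=probs))

      logits = [2.0, 1.5, 0.1]          # hypothetical scores for 3 tokens
      print(sample_token(logits, 0))    # always token 0
      print(sample_token(logits, 1.0))  # occasionally token 1 or 2
      ```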

    • daellat@lemmy.world
      12 hours ago

      Well, LLMs are capable (but prone to hallucination) and cost an absolute fuckton of energy. There have been purpose-trained, efficient ML models that we’ve used for years. Document Understanding and Computer Vision are great, just don’t use an LLM for them.
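
      For example, a small purpose-trained model handles the entity-tagging part of that extraction without an LLM. A sketch using a Hugging Face NER pipeline (the checkpoint is just a commonly used public example, not a recommendation; a CoNLL-style model tags people, orgs, and places, so dates and amounts would need a different purpose-trained model):

      ```python
      # Sketch: purpose-trained NER model instead of an LLM for extraction.
      # "dslim/bert-base-NER" is an example public checkpoint, not an endorsement.
      from transformers import pipeline

      ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

      text = "On 12 March 1524, Hans Fugger paid 300 gulden to the city of Augsburg."
      for entity in ner(text):
          print(entity["entity_group"], entity["word"], round(float(entity["score"]), 2))
      ```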