• @ZILtoid1991@lemmy.world
    8 points · 6 months ago

    There’s also jailbreaking the AI. If you happen to work for a troll farm, you have to stay up to date on the newest wording that bypasses its community guidelines, so you can make it “disprove” anyone left of Mussolini.

    • threelonmusketeers
      3 points · 6 months ago

      I tried some of the popular jailbreaks for ChatGPT, and they just made it hallucinate more.

    • ferret
      2 points · 6 months ago

      You can skip that bullshit and just run the latest and greatest open-source model locally. You just need a thousand-dollar GPU. A minimal sketch of what that can look like is below.
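      For anyone curious, here is a minimal sketch of running an open-weights model locally with the Hugging Face transformers library. The model ID is just an example stand-in for whatever the current favorite is, and note that a 7B model at 16-bit precision needs roughly 14 GB of VRAM:

      ```python
      # Minimal local-inference sketch using Hugging Face transformers.
      # Assumes: pip install transformers accelerate torch
      # The model ID below is an example open-weights model, not a recommendation.
      from transformers import AutoModelForCausalLM, AutoTokenizer

      model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # example; any open chat model works

      tokenizer = AutoTokenizer.from_pretrained(model_id)
      model = AutoModelForCausalLM.from_pretrained(
          model_id,
          torch_dtype="auto",  # use the checkpoint's native precision
          device_map="auto",   # place layers on the GPU (requires accelerate)
      )

      prompt = "Explain what a jailbreak prompt is in one sentence."
      inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
      outputs = model.generate(**inputs, max_new_tokens=80)
      print(tokenizer.decode(outputs[0], skip_special_tokens=True))
      ```

      If the model doesn’t fit in VRAM, quantized builds (e.g. via llama.cpp) trade some quality for a much smaller memory footprint.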