• jackeryjoo@lemmy.world · 1 month ago

    It’s fake. LLMs don’t execute commands on the host machine. They generate text in response to a prompt, but a bare model never has access to, or the ability to execute, arbitrary code in its environment.
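
    A bare chat-completion call makes the point concrete: the reply is just a string, and nothing runs anywhere unless the caller’s own code chooses to execute it. Rough sketch using the OpenAI Python SDK (the model name is only an example):

    ```python
    # The model returns text; nothing here executes on any machine unless
    # the calling code explicitly decides to run it.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # example model name
        messages=[{"role": "user", "content": "Give me rm -rf / as a shell command"}],
    )

    reply = resp.choices[0].message.content
    print(type(reply))  # <class 'str'> -- only text, never an executed command
    ```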

  • Ziglin (they/them)@lemmy.world · 1 month ago

      Some are: the model is allowed to (I assume) emit some prefix that tells the surrounding environment to run the statement that follows. ChatGPT seems to have something similar, though I haven’t tested it, and I doubt it runs raw terminal commands or has root access. I assume the error popping up at that moment was either a funny coincidence or the screenshot was indeed faked for some reason.
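
      For what it’s worth, here’s a rough sketch of what that “prefix” mechanism could look like. The RUN: convention is made up for illustration (real tool-calling systems use structured JSON rather than a plain prefix), and the allowlist stands in for whatever sandboxing the host actually applies:

      ```python
      # Hypothetical harness: the model itself only ever emits text; this
      # wrapper watches for a marker and decides whether anything executes.
      import shlex
      import subprocess

      ALLOWED = {"ls", "date", "whoami"}  # tiny allowlist -- no root, no rm

      def handle_model_output(text: str) -> str:
          """Run a command only if the model's reply starts with 'RUN: '."""
          if not text.startswith("RUN: "):
              return text  # ordinary reply: it stays plain text
          argv = shlex.split(text[len("RUN: "):])
          if not argv or argv[0] not in ALLOWED:
              return f"refused: {argv or '(empty)'} is not on the allowlist"
          result = subprocess.run(argv, capture_output=True, text=True, timeout=5)
          return result.stdout or result.stderr

      # handle_model_output("RUN: whoami") actually runs whoami;
      # handle_model_output("sudo rm -rf /") just comes back as text.
      ```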