• @lunarul@lemmy.world
    −20 points · 5 months ago · edited

    LLMs reproduce the form of language without any meaning being transmitted. That’s called parroting.

    Even if (and that’s a big if) an AGI is going to be achieved at some point, there will be people calling it parroting by that definition. That’s the Chinese room argument.

      • @lunarul@lemmy.world
        −4 points · 5 months ago

        Me? How can I move goalposts in a single sentence? We’ve had no previous conversation… And I’m not agreeing with the previous poster either…

        • @Prunebutt@slrpnk.net
          7 points · 5 months ago

          By entering the discussion, you also engaged with the previous context. The discussion was about LLMs being parrots.

          • @lunarul@lemmy.world
            0 points · 5 months ago

            And the argument was about whether there’s meaning behind what they generate. That argument applies to AGIs too. It’s a deeply debated philosophical question. What is meaning? Is our own thought pattern deterministic, and if it is, how do we know there’s any meaning behind our own actions?

            • @Prunebutt@slrpnk.net
              3 points · 5 months ago

              The burden of proof lies on the people making the claims about intelligence. “AI” pundits have supplied nothing but marketing hype.