• lunarul@lemmy.world · 4 days ago

    LLMs reproduce the form of language without any meaning being transmitted. That’s called parroting.

Even if (and that’s a big if) AGI is achieved at some point, there will be people calling it parroting by that definition. That’s the Chinese room argument.

      • lunarul@lemmy.world · 3 days ago

        Me? How can I move goalposts in a single sentence? We’ve had no previous conversation… And I’m not agreeing with the previous poster either…

        • Prunebutt@slrpnk.net · 3 days ago

          By entering the discussion, you also engaged with the previous context. The discussion was about LLMs being parrots.

          • lunarul@lemmy.world · 3 days ago

            And the argument was whether there’s meaning behind what they generate. That argument applies to AGIs too. It’s a deeply debated philosophical question: what is meaning? Is our own thought pattern deterministic, and if it is, how do we know there’s any meaning behind our own actions?

            • Prunebutt@slrpnk.net · 3 days ago

              The burden of proof lies with the people making claims about intelligence. “AI” pundits have supplied nothing but marketing hype.