I’m usually the one saying “AI is already as good as it’s gonna get, for a long while.”

This article, in contrast, quotes folks building the next generation of AI who are saying the same thing.

    • @blackbelt352@lemmy.world

      It’s a lot. Like a lot a lot. GPUs have about 150 billion transistors, but each transistor makes only one connection, printed in what is essentially a 2D layer on silicon.

      Each neuron makes dozens of connections, and there are on the order of 100 billion neurons in a blobby lump of fat and neurons that takes up 3D space. Combine that with the fact that everything actually functions through patterns of multiple neurons firing together, and you get an absurdly high ceiling on how powerful human brains can be.

      At this point, I’m not sure there are enough GPUs in the world to mimic what a human brain can do.
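
The back-of-envelope comparison in that comment can be sketched in a few lines. A minimal sketch, using the comment's own figures (150 billion transistors per GPU, 100 billion neurons, and taking "dozens of connections" as roughly 50 per neuron; real synapse-per-neuron estimates vary widely):

```python
# Rough back-of-envelope comparison using the figures from the comment above.
gpu_transistors = 150e9        # transistors on a large GPU die
neurons = 100e9                # neurons in a human brain (approx.)
connections_per_neuron = 50    # "dozens" of connections, taken as ~50

total_connections = neurons * connections_per_neuron
gpus_worth = total_connections / gpu_transistors

print(f"Total connections: {total_connections:.0e}")   # ~5e12
print(f"GPUs' worth of transistors: {gpus_worth:.0f}") # ~33
```

Even at this deliberately low-end estimate of connections per neuron, the connection count dwarfs a single GPU's transistor budget, and each transistor is a far simpler element than a synapse.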

    • @cron@feddit.org

      I don’t think your brain can be reasonably compared with an LLM, just like it can’t be compared with a calculator.

      • @GetOffMyLan@programming.dev

        LLMs are based on neural networks, which are a massively simplified model of how our brains work. So you kind of can, as long as you keep in mind they are orders of magnitude simpler.
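
The simplification that comment describes is easy to see in code. A minimal sketch of a single artificial neuron (the names and numbers here are illustrative, not from any particular library): it is just a weighted sum passed through a nonlinearity, whereas a biological neuron adds spike timing, neurochemistry, and ongoing structural plasticity on top.

```python
import math

def artificial_neuron(inputs, weights, bias):
    """One 'neuron' in a neural network: a weighted sum of inputs
    plus a bias, squashed through a sigmoid. This is the entire
    computational model that LLMs are built from, stacked in layers."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))  # sigmoid activation, output in (0, 1)

# Example: two inputs, hand-picked weights.
out = artificial_neuron([1.0, 0.5], [0.8, -0.2], bias=0.1)
print(out)  # ~0.69
```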