Running AI models without matrix math means far less power consumption—and fewer GPUs?

    • FaceDeer
      16 months ago

      I don’t think that making LLMs cheaper and easier to run is going to “pop that bubble,” if it even is a bubble. If anything, this will boost AI applications tremendously.