- cross-posted to:
- machine_learning@programming.dev
- technology@lemmy.world
Running AI models without matrix math means far less power consumption—and fewer GPUs?
Let’s pop that bubble
I don’t think making LLMs cheaper and easier to run is going to “pop that bubble”, if it even is a bubble. If anything, this will boost AI applications tremendously.
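
For anyone wondering what “without matrix math” actually means here: as I understand it, the approach constrains weights to ternary values in {-1, 0, +1}, so the usual multiply-accumulate in a matmul collapses into additions and subtractions, which is where the power savings come from. Here’s a rough numpy sketch of the idea (the function name, shapes, and toy values are mine, not from the paper):

```python
import numpy as np

def ternary_linear(x, w_ternary):
    """Linear layer with ternary weights in {-1, 0, +1}.

    Because each weight is -1, 0, or +1, the multiply-accumulate of a
    dense matmul reduces to additions and subtractions -- no
    floating-point multiplications are needed for the weight matrix.
    x: (in_features,) activation vector
    w_ternary: (out_features, in_features) matrix with values in {-1, 0, +1}
    """
    out = np.zeros(w_ternary.shape[0], dtype=x.dtype)
    for i, row in enumerate(w_ternary):
        # add activations where the weight is +1, subtract where it is -1,
        # skip zeros entirely
        out[i] = x[row == 1].sum() - x[row == -1].sum()
    return out

# toy example: same result as a dense matmul, but computed without multiplies
x = np.array([0.5, -1.0, 2.0, 0.25], dtype=np.float32)
w = np.array([[ 1, 0, -1, 1],
              [-1, 1,  0, 0]], dtype=np.int8)
print(ternary_linear(x, w))        # [-1.25 -1.5 ]
print(w.astype(np.float32) @ x)    # reference dense matmul, same values
```

Obviously the real gains come from hardware that exploits this (no multiplier units, cheaper memory traffic), not from a Python loop, but it shows why dropping matrix multiplication is even possible.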