☆ Yσɠƚԋσʂ ☆ to Technology@lemmy.ml (English) • 5 months ago
Researchers upend AI status quo by eliminating matrix multiplication in LLMs (arstechnica.com)
cross-posted to: technology@lemmygrad.ml
@theshatterstone54@feddit.uk • 5 months ago (edited)
Why are people downvoting? This is huge and should make LLMs more power-efficient and memory-efficient.
☆ Yσɠƚԋσʂ ☆ (OP) • 5 months ago
Indeed, this seems like a big step forward; here’s a link to the model: https://github.com/ridgerchu/matmulfreellm
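For anyone wondering why removing matrix multiplication helps with power and memory: the approach constrains weights to {-1, 0, +1}, so applying a weight becomes an add, a subtract, or a skip rather than a multiply. Here is a minimal NumPy sketch of that idea; it is an illustration only, not the code from the linked repo, and the quantization threshold used is a hypothetical heuristic.

```python
# Minimal sketch of the "matmul-free" idea: with ternary weights in {-1, 0, +1},
# a dot product needs no multiplications -- only additions, subtractions, and skips.
# Illustration only; not the implementation from ridgerchu/matmulfreellm.
import numpy as np

def ternary_quantize(w: np.ndarray) -> np.ndarray:
    """Quantize real-valued weights to {-1, 0, +1} (threshold heuristic is an assumption)."""
    threshold = 0.7 * np.mean(np.abs(w))
    return np.sign(w) * (np.abs(w) > threshold)

def matmul_free_linear(x: np.ndarray, w_ternary: np.ndarray) -> np.ndarray:
    """Compute x @ w_ternary using only adds and subtracts, no multiplies."""
    out = np.zeros((x.shape[0], w_ternary.shape[1]))
    for j in range(w_ternary.shape[1]):
        plus = w_ternary[:, j] == 1    # input columns to add
        minus = w_ternary[:, j] == -1  # input columns to subtract
        out[:, j] = x[:, plus].sum(axis=1) - x[:, minus].sum(axis=1)
    return out

# Sanity check: the accumulation-only path matches an ordinary matmul on ternary weights.
x = np.random.randn(2, 8)
w = ternary_quantize(np.random.randn(8, 4))
assert np.allclose(matmul_free_linear(x, w), x @ w)
```

Since additions are far cheaper than multiplications in silicon, and ternary weights can be stored in about 1.58 bits each instead of 16, this is where the claimed power and memory savings come from.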