Shatur@lemmy.ml to Linux@lemmy.ml · 1 year ago
AMD Quietly Funded A Drop-In CUDA Implementation Built On ROCm: It's Now Open-Source (www.phoronix.com)
517 points · 22 comments
Cross-posted to: linux@linux.community, technology@lemmy.zip, stable_diffusion@lemmy.dbzer0.com, programming@programming.dev, technology@lemmy.world, linuxfurs@pawb.social, linux@lemmy.world, programming@lemmy.ml, opensource@lemmy.ml
UraniumBlazer@lemm.ee · edited 1 year ago
CUDA is required to interface with Nvidia GPUs, and AI workloads almost always need a GPU for the best performance.