• Possibly linux · 5 points · 10 months ago

      I’ll run whichever doesn’t require a bunch of proprietary software. Right now it’s neither.

      • Domi · 8 points · 10 months ago

        AMD’s ROCm stack is fully open source (except for the GPU firmware blobs). Not as good as Nvidia’s yet, but decent.

        Mesa also has its own OpenCL stack, but I haven’t tried it yet.

        • Possibly linux · 0 points · 10 months ago

          AMD ROCm needs the AMD Pro drivers, which are painful to install and are proprietary.

          • Domi · 14 points · 10 months ago

            It does not.

            ROCm runs directly on top of the open source amdgpu kernel module; I use it every week.

            • Possibly linux · 3 points · 10 months ago (edited)

              How, and with what card? I have an XFX RX 590 and just gave up on acceleration, as it was slow even after I initially got it set up.

              • Domi · 9 points · 10 months ago (edited)

                I use a 6900 XT and run llama.cpp and ComfyUI inside Docker containers. I don’t think the RX 590 is officially supported by ROCm. There’s an environment variable you can set to enable support for unsupported GPUs, but I’m not sure how well it works.
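
                For what it’s worth, a sketch of what that override looks like. The variable is `HSA_OVERRIDE_GFX_VERSION`; the value `8.0.3` below is what people commonly try for Polaris cards like the RX 590 (gfx803) — that value is an assumption on my part, and it doesn’t guarantee the compiled kernels will actually run on that card:

                ```shell
                # HSA_OVERRIDE_GFX_VERSION makes the ROCm runtime report the GPU as a
                # different ISA. 8.0.3 is the Polaris (gfx803) spelling; adjust for
                # your card, and expect breakage on truly unsupported hardware.
                export HSA_OVERRIDE_GFX_VERSION=8.0.3
                # Then launch the workload as usual from the same shell/container.
                echo "$HSA_OVERRIDE_GFX_VERSION"
                ```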

                AMD provides the handy rocm/dev-ubuntu-22.04:5.7-complete image, which is absolutely massive but comes with everything needed to run ROCm without dependency hell on the host. I just build llama.cpp and ComfyUI containers on top of that and run them.
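
                A minimal sketch of such a container (illustrative only — the llama.cpp HIP CMake flag has been renamed across versions, so check the docs for your checkout):

                ```dockerfile
                # Build llama.cpp on top of AMD's all-in-one ROCm dev image.
                FROM rocm/dev-ubuntu-22.04:5.7-complete

                RUN apt-get update && apt-get install -y git cmake build-essential \
                 && rm -rf /var/lib/apt/lists/*

                RUN git clone https://github.com/ggerganov/llama.cpp /opt/llama.cpp
                WORKDIR /opt/llama.cpp
                # Older trees use -DLLAMA_HIPBLAS=ON; newer ones renamed it (e.g. GGML_HIP).
                RUN cmake -B build -DLLAMA_HIPBLAS=ON && cmake --build build -j
                ```

                At run time the container also needs the GPU device nodes passed through, e.g. `docker run --device=/dev/kfd --device=/dev/dri --group-add video …`, per AMD’s ROCm Docker instructions.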