• TunaCowboy · 28 points · 6 months ago

    Look forward to this completing just in time for ROCm to drop support for your card.

    • Domi · 7 points · 6 months ago

      I use ROCm for inference, both text generation via llama.cpp/LMStudio and image generation via ComfyUI.

      Works pretty much perfectly on a 6900 XT. Very fast and easy to setup.

      I had issues with some libraries that only support CUDA when I tried to train, but that was almost 6 months ago, so things have probably improved in that area as well.
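
      For anyone wanting to reproduce the llama.cpp part of this setup, a rough sketch of a ROCm/HIP build follows. The CMake flag names have changed across llama.cpp versions (older releases used `-DLLAMA_HIPBLAS=ON`), and the `gfx1030` target for the 6900 XT is an assumption worth double-checking against your card, so treat this as a starting point rather than exact instructions.

      ```shell
      # Build llama.cpp with the ROCm/HIP backend
      # (flag names vary by version; older releases used -DLLAMA_HIPBLAS=ON)
      git clone https://github.com/ggerganov/llama.cpp
      cd llama.cpp
      cmake -B build -DGGML_HIP=ON -DAMDGPU_TARGETS=gfx1030  # gfx1030 = RX 6900 XT (assumed)
      cmake --build build --config Release -j

      # Run inference with all layers offloaded to the GPU (-ngl 99)
      ./build/bin/llama-cli -m model.gguf -ngl 99 -p "Hello"
      ```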

  • Presi300 · 10 points · 5 months ago

    I’ve commented this before and I’ll say it again: DO NOT try to install ROCm/HIP manually, it’s a nightmare. AMD provides preconfigured Docker containers with it already set up. Download one of them and do whatever you need to do inside that.
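
    As a concrete sketch of that workflow: AMD publishes prebuilt images such as `rocm/pytorch` on Docker Hub, and the device/group flags below come from AMD's own container instructions. The exact tag and flags may differ for your distro and ROCm version, so verify against current AMD docs before relying on this.

    ```shell
    # Pull AMD's prebuilt ROCm + PyTorch image (tag may vary; check Docker Hub)
    docker pull rocm/pytorch:latest

    # Run it with the GPU passed through:
    # /dev/kfd is the ROCm compute interface, /dev/dri exposes the render nodes
    docker run -it \
      --device=/dev/kfd \
      --device=/dev/dri \
      --group-add video \
      --security-opt seccomp=unconfined \
      rocm/pytorch:latest
    ```

    Everything ROCm-related then lives inside the container, so nothing has to be installed on the host beyond the amdgpu kernel driver and Docker itself.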