eliranwong opened this issue 3 months ago
The autotune feature of Triton is sensitive to the Triton version. Have you tried the rocm/pytorch docker image (https://hub.docker.com/r/rocm/pytorch)? It comes preinstalled with the appropriate Triton version.
I run ROCm 6.1.3 (https://rocm.docs.amd.com/projects/radeon/en/latest/index.html), which officially supports AMD Radeon™ 7000 series GPUs. I think the docker image ships an older version, so it is not an option for me.
I use pytorch-triton-rocm 2.1.0+rocm6.1.3.4d510c3a44, officially provided by AMD:
https://rocm.docs.amd.com/projects/radeon/en/latest/docs/install/native_linux/install-pytorch.html
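Since version compatibility is the issue here, one quick sanity check is to confirm that the installed wheel's version string actually targets your ROCm release. A minimal sketch (the helper name and parsing are my own, not from AMD's docs): the local-version segment after the `+` in a version like `2.1.0+rocm6.1.3.4d510c3a44` encodes the ROCm build.

```python
def matches_rocm(version: str, rocm: str) -> bool:
    """Hypothetical helper: check whether a wheel version string
    (e.g. "2.1.0+rocm6.1.3.4d510c3a44") targets the given ROCm release.
    """
    # The local-version segment after "+" encodes the ROCm build.
    _, _, local = version.partition("+")
    return local.startswith("rocm" + rocm)

# The version string can be read at runtime, e.g. via
# importlib.metadata.version("pytorch-triton-rocm").
print(matches_rocm("2.1.0+rocm6.1.3.4d510c3a44", "6.1.3"))  # True
```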
My setup: Dual AMD RX 7900 XTX + ROCm 6.1.3; full setup recorded at https://github.com/eliranwong/MultiAMDGPU_AIDev_Ubuntu
I followed the fix at https://github.com/state-spaces/mamba/issues/412 to install mamba.
I ran the following command suggested in your repo, but encountered errors:
Full log of the errors: