Open johnshaughnessy opened 8 months ago
Try ROCm 5.7.1, I think 6.0.0 is too new for your GPU. Also "export HSA_OVERRIDE_GFX_VERSION=10.3.0" should work just fine on your GPU. Try using the nightly pytorch.
pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm5.7
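Taken together, the suggestion above can be sketched as follows. This is a hedged sketch, not a verified fix: the override value 10.3.0 assumes the card maps to the gfx1030 ISA target (typical for RDNA2 cards like the RX 6900 XT), and the model path is a placeholder.

```shell
# Sketch of the suggested workaround. Assumes an RDNA2 card (RX 6900 XT);
# the override makes ROCm treat the GPU as gfx1030.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# The nightly PyTorch install and llamafile invocation from the comment,
# shown as comments here since they require network access and the GPU:
#   pip3 install --pre torch torchvision torchaudio \
#       --index-url https://download.pytorch.org/whl/nightly/rocm5.7
#   llamafile --gpu amd -m <model>.gguf

# Confirm the override is exported for child processes:
echo "HSA_OVERRIDE_GFX_VERSION=$HSA_OVERRIDE_GFX_VERSION"
```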
Install Windows and problem solved @johnshaughnessy
I have not been able to get `llamafile --gpu amd` working with an AMD Radeon RX 6900 XT on linux. The relevant line of the log seems to be: (full log)
I found similar bug reports in other projects, so I suspect this is NOT a `llamafile` bug:

- JuliaGPU
- ROCm-OpenCL-Runtime
- gentoo forums
- ROCm
- ROCm
Instead, it seems that ROCm is not supported for my graphics card on linux: (full rocminfo)
Searching the AMD docs, I found:

- ROCm™ Software 6.0.0 does not list 6900 series cards in its linux support matrix.
- ROCm™ Software 6.0.0 does list 6900 series cards in its windows support matrix.

I tried messing with the environment variable `HSA_OVERRIDE_GFX_VERSION` because I had seen it in some other issue reports, but did not have any luck.

In case it's helpful, I kept a log of the steps I took when setting things up.
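For anyone comparing against their own setup, one way to pull the ISA target out of `rocminfo` output is sketched below. The sample line is illustrative only; on a real system you would pipe `rocminfo` itself (e.g. `rocminfo | grep -m1 gfx`).

```shell
# Extract the gfx target from a rocminfo-style line. The sample is
# illustrative; replace it with real output from:  rocminfo | grep -m1 gfx
sample='  Name:                    gfx1030'
gfx=$(printf '%s\n' "$sample" | sed -n 's/.*\(gfx[0-9a-f]*\).*/\1/p')
echo "$gfx"   # the RX 6900 XT reports gfx1030
```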
To summarize, I installed ROCm for Arch Linux, but it seems that my graphics card (Radeon RX 6900 XT) is not supported by ROCm on linux, so I cannot use the `--gpu amd` flag with `llamafile`.

If this is correct, then it is not a bug with `llamafile`. Still, I wanted to file this issue in case it's worth noting in the Gotchas section of the README.md.