Closed: kaattaalan closed this issue 1 month ago
The compatible keyword might not be working currently due to recent updates from LM Studio. This issue is expected to be resolved in the next LM Studio release.

Workaround for now:
While waiting for the fix, you can still make it work by following these steps: re-edit the compatible keyword, copy llama.dll and ggml.dll from the Ollama for AMD repository into your LM Studio extension folder, and make sure you choose the correct Ollama version (a rough sketch of the copy step follows below).
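In case it helps, here is a minimal Python sketch of the DLL copy step. The source and destination paths are assumptions (wherever you extracted the Ollama-for-AMD archive, and wherever your LM Studio ROCm extension version lives on your machine), so adjust them before running.

```python
# Minimal sketch: copy llama.dll and ggml.dll from an extracted
# Ollama-for-AMD release into the LM Studio ROCm extension folder.
# Both paths below are assumptions -- adjust them for your setup.
from pathlib import Path
import shutil

# Hypothetical source: folder where the Ollama-for-AMD archive was extracted.
ollama_dir = Path(r"C:\tools\ollama-windows-amd64-rocm")

# Hypothetical destination: the LM Studio ROCm extension folder
# (the version subfolder may differ, e.g. 1.1.5 vs 1.1.10).
extension_dir = (
    Path.home() / ".cache" / "lm-studio" / "extensions" / "backends"
    / "llama.cpp-win-x86_64-amd-rocm-avx2-1.1.5"
)

if not extension_dir.exists():
    raise FileNotFoundError(f"{extension_dir} not found -- adjust extension_dir")

for name in ("llama.dll", "ggml.dll"):
    src = ollama_dir / name
    dst = extension_dir / name
    if not src.exists():
        raise FileNotFoundError(f"{src} not found -- check the Ollama path")
    shutil.copy2(src, dst)  # overwrite the extension's copy of the DLL
    print(f"copied {src} -> {dst}")
```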
Thanks a lot for the reply. I tried your workaround and the models are loading and generating now. (I was getting GGGG output, but fixed it by turning Flash Attention on.)
But for some reason, it only works for version 1.1.5 of the extension. If I try updating it to 1.1.10 (to see whether the GGGG output will be fixed) and do the same steps, it won't work anymore. I installed the extension by following the guide: https://github.com/lmstudio-ai/configs/blob/main/Extension-Pack-Instructions.md
Here's what I did (for reference)
1.1.10 needs updated llama.dll and ggml.dll from an Ollama build compiled with ROCm 5.7; however, the files from Ollama for AMD are built with 6.1.2. A possible solution would be to build Ollama yourself with ROCm 5.7 and grab the files from that build, or wait for the next release, which may include 5.7.
Thanks a ton for the reply. I don't mind waiting for the next build; in the meantime I will experiment with Ollama for a few days. Thanks for all the hard work.
I have updated ollama-windows-amd64-rocm-5.7.7z; it should work with the 1.1.10 extension for LM Studio. Feel free to test it.
Just tested Ollama 0.3.11 (5.7.7 build) with the ROCm extension 1.1.10. Models are not loading. The only combo that seems to work is 0.3.6 with 1.1.5.
It's a bit tricky. You also need to re-edit the information and replace the files following the steps above. If you miss one of the steps, or use the libs from ROCm 6.1.2, it won't load. Sometimes the latest drivers help.
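For anyone retracing the "re-edit the information" step, below is a rough Python sketch of what that edit could look like. The manifest location, the compatible field name, and its list structure are all assumptions based on this thread, not the actual LM Studio schema; follow the wiki guide for the exact keys and values to change.

```python
# Minimal sketch of the "re-edit information" step: back up the extension's
# manifest.json and adjust a hypothetical "compatible" entry so the extension
# is offered for your GPU. Field names and structure are assumptions.
import json
import shutil
from pathlib import Path

# Hypothetical extension folder (the same one the DLLs were copied into).
extension_dir = (
    Path.home() / ".cache" / "lm-studio" / "extensions" / "backends"
    / "llama.cpp-win-x86_64-amd-rocm-avx2-1.1.10"
)
manifest_path = extension_dir / "manifest.json"

# Keep an untouched copy so a broken edit is easy to undo.
shutil.copy2(manifest_path, manifest_path.with_name(manifest_path.name + ".bak"))

manifest = json.loads(manifest_path.read_text(encoding="utf-8"))

# Assumed key: a "compatible" list of supported GPU targets.
targets = manifest.setdefault("compatible", [])
if "gfx1031" not in targets:  # example GPU target mentioned in this thread
    targets.append("gfx1031")

manifest_path.write_text(json.dumps(manifest, indent=2), encoding="utf-8")
print("patched", manifest_path)
```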
Tried following the wiki https://github.com/likelovewant/ROCmLibs-for-gfx1103-AMD780M-APU/wiki/Unlock-LM-Studio-on-Any-AMD-GPU-with-ROCm-Guide#using-amd-graphics-cards-with-lm-studio
Copied the files, restarted LMS. The GPU (gfx1031) is still showing up as unsupported.
When I tried loading the model anyway, I got the following error in LM Studio: