Open kaattaalan opened 1 day ago
The `compatible` keyword might not be working currently due to recent updates from LM Studio. This issue is expected to be resolved in the next LM Studio release.
Workaround for now:
While waiting for the fix, you can still make it work by following these steps: instead of relying on the `compatible` keyword, copy `llama.dll` and `ggml.dll` from the Ollama-for-AMD repository into your LM Studio extension folder.
Choosing the correct Ollama version:
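The copy step above can be sketched as a small script. This is only an illustration: the directory locations and the `copy_override_dlls` helper are assumptions, not part of LM Studio or Ollama; adjust the paths to your own install.

```python
# Sketch of the DLL-copy workaround. The paths and helper name are
# hypothetical examples, not official LM Studio/Ollama locations.
import shutil
from pathlib import Path

def copy_override_dlls(ollama_dir: Path, extension_dir: Path) -> list:
    """Copy llama.dll and ggml.dll from an Ollama-for-AMD build
    over the copies shipped with the LM Studio extension."""
    copied = []
    for name in ("llama.dll", "ggml.dll"):
        src = ollama_dir / name
        if not src.exists():
            raise FileNotFoundError(
                f"{src} not found; check your Ollama-for-AMD install"
            )
        # copy2 preserves file metadata and returns the destination path
        copied.append(Path(shutil.copy2(src, extension_dir / name)))
    return copied
```

Restart LM Studio after copying so the extension picks up the replaced libraries.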
Thanks a lot for the reply. I tried your workaround and the models are loading and generating now. (I was getting GGGG output, but fixed it by turning Flash Attention on.)
But for some reason, it only works with version 1.1.5 of the extension. If I update it to 1.1.10 (to see whether the GGGG output is fixed there) and do the same steps, it won't work anymore. I installed the extension by following this guide: https://github.com/lmstudio-ai/configs/blob/main/Extension-Pack-Instructions.md
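When the workaround stops working after an extension update, one quick sanity check is to confirm the copied DLLs actually replaced the extension's own copies (an update may have overwritten them). A minimal sketch, where the comparison helper and all paths are assumptions for illustration:

```python
# Compare file contents by hash to check whether the Ollama-for-AMD
# DLLs really ended up in the extension folder. Paths are hypothetical.
import hashlib
from pathlib import Path

def same_file(a: Path, b: Path) -> bool:
    """Return True if both files exist and have identical contents."""
    if not (a.exists() and b.exists()):
        return False
    digest = lambda p: hashlib.sha256(p.read_bytes()).hexdigest()
    return digest(a) == digest(b)

# Example (adjust to your install):
#   same_file(ollama_dir / "llama.dll", extension_dir / "llama.dll")
# False here means the extension is still using its original DLL.
```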
Here's what I did (for reference):
Tried following the wiki: https://github.com/likelovewant/ROCmLibs-for-gfx1103-AMD780M-APU/wiki/Unlock-LM-Studio-on-Any-AMD-GPU-with-ROCm-Guide#using-amd-graphics-cards-with-lm-studio
Copied the files and restarted LM Studio. The GPU (gfx1031) still shows up as unsupported.
When I tried loading the model anyway, I got the following error in LM Studio: