mikeperalta1 opened 2 weeks ago
When Mixtral came out I composed these instructions to get it to work — they include details about llama.cpp, such as compiling: https://github.com/nktice/AMD-AI/blob/main/Mixtral.md. Have you tried such an install on your own system? A custom fresh install may allow compiling with support for your card.
Hey I appreciate you getting back to me. I ended up solving the above error with an environment trick:
HSA_OVERRIDE_GFX_VERSION=10.3.0
llama.cpp still failed to utilize my card, but at least it didn't crash at that point. I ended up jumping ship to Ollama, which worked right away.
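For anyone landing here later, a minimal sketch of the workaround above — assuming llama.cpp was built with ROCm/HIP support; the binary name and model path are illustrative, not from this thread:

```shell
# gfx1032 (e.g. RX 6600 XT) is not on ROCm's official support list, so the
# HSA runtime refuses it. This override tells the runtime to report the
# supported gfx1030 ISA instead, which shares the RDNA2 instruction set.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Then launch llama.cpp as usual in the same shell, e.g.:
# ./main -m ./models/mixtral.gguf -ngl 99 -p "Hello"
```

Note this only stops the HSA runtime from rejecting the card; as mentioned above, llama.cpp may still fail to actually use the GPU if it wasn't compiled for the right target.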
Someone else posted about llama.cpp in another thread (https://github.com/nktice/AMD-AI/issues/7), so I figured out a new way to install it and explained it there.
Nice, I'll check it out!
Any advice for an older 6600 XT card that seems to want "gfx1032" for llama.cpp? I've tried Ubuntu 24 and 23.10 and get this crash: