LostRuins / koboldcpp

Run GGUF models easily with a KoboldAI UI. One File. Zero Install.
https://github.com/lostruins/koboldcpp
GNU Affero General Public License v3.0

make LLAMA_HIPBLAS=1 gives endless warnings on Koboldcpp-Rocm #661

Open SusieDreemurr opened 8 months ago

SusieDreemurr commented 8 months ago

I'm trying to run on GPU only. I get these endless warnings when I run `make LLAMA_HIPBLAS=1 && ./koboldcpp.py`. Even if I ignore the warnings and run `./koboldcpp.py` on its own, hipblas doesn't show up there. My only option is "Use No BLAS".

I'm using an AMD Radeon VII GPU.

[Screenshot: "Kobold Error 2"]
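For context, the build-and-run sequence being attempted boils down to the following (a minimal sketch; the parallel `-j` flag is an optional addition not mentioned in the thread):

```shell
# Build KoboldCpp with hipBLAS (ROCm) support. The compile emits a large
# number of warnings; these are expected and harmless as long as no errors
# (shown in red) appear.
make LLAMA_HIPBLAS=1 -j"$(nproc)"

# Launch the UI. If the hipBLAS build succeeded, a hipBLAS option should
# appear in the backend list instead of only "Use No BLAS".
python koboldcpp.py
```

Note the original post chained the two steps with `&&`, which causes the launch to run only after the build finishes without errors.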

YellowRoseCx commented 7 months ago

The warnings are normal; do you get any errors (they'll be in red) when building?

SusieDreemurr commented 7 months ago

> The warnings are normal; do you get any errors (they'll be in red) when building?

No errors, but if the warnings are normal, then never mind. I just thought something had gone wrong. I managed to get it working now.

AdrianGroty commented 6 months ago

> No errors, but if the warnings are normal, then never mind. I just thought something had gone wrong. I managed to get it working now.

How did you get it working? I can only get it to run with noblas, and hipblas isn't showing up in my list, either.