youngzyl opened 9 months ago
Same issue here with a 6750XT.
It loads most models correctly, but fails on the latest quantized Gemma models (e.g. https://huggingface.co/rahuldshetty/gemma-7b-it-gguf-quantized or https://huggingface.co/LoneStriker/gemma-7b-it-GGUF/tree/main).
Support for Gemma has not been included in 1.58 yet; you may have to try again in the next release.
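If you want to confirm that the load failure is an unsupported architecture rather than a corrupt download, you can read the `general.architecture` key straight out of the GGUF header yourself. The sketch below is a minimal, unofficial parser: it assumes the GGUF v3 layout (magic, version, tensor count, metadata KV count, then key/value pairs) and only handles string-typed values; the file name `demo.gguf` and the writer helper are purely illustrative stand-ins for a real model file.

```python
import struct

def write_minimal_gguf(path, arch):
    # Illustrative only: writes a tiny GGUF v3 header with a single
    # metadata key, "general.architecture" -> arch, and no tensors.
    with open(path, "wb") as f:
        f.write(b"GGUF")
        f.write(struct.pack("<I", 3))              # format version
        f.write(struct.pack("<Q", 0))              # tensor count
        f.write(struct.pack("<Q", 1))              # metadata KV count
        key = b"general.architecture"
        f.write(struct.pack("<Q", len(key)) + key)
        f.write(struct.pack("<I", 8))              # value type 8 = string
        val = arch.encode("utf-8")
        f.write(struct.pack("<Q", len(val)) + val)

def read_architecture(path):
    # Parse just enough of the GGUF header to find general.architecture.
    with open(path, "rb") as f:
        if f.read(4) != b"GGUF":
            raise ValueError("not a GGUF file")
        version, = struct.unpack("<I", f.read(4))
        n_tensors, = struct.unpack("<Q", f.read(8))
        n_kv, = struct.unpack("<Q", f.read(8))
        for _ in range(n_kv):
            klen, = struct.unpack("<Q", f.read(8))
            key = f.read(klen).decode("utf-8")
            vtype, = struct.unpack("<I", f.read(4))
            if vtype != 8:
                # Sketch only handles string values; a real parser
                # would skip non-string values by their encoded size.
                raise ValueError("unhandled metadata value type")
            vlen, = struct.unpack("<Q", f.read(8))
            val = f.read(vlen).decode("utf-8")
            if key == "general.architecture":
                return val
    return None

write_minimal_gguf("demo.gguf", "gemma")
print(read_architecture("demo.gguf"))
```

If this prints `gemma` for your model but the backend errors out on load, the build's bundled llama.cpp simply doesn't know that architecture yet, which matches the reply above.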
This issue happens on the latest test build release of koboldcpp-rocm
Version: KoboldCPP-v1.56.yr1-ROCm
Computer specs: CPU: R5 5600, RAM: 32 GB, GPU: RX 6700 XT