Closed: LFL38 closed this issue 1 year ago
I don't think you can use GPTQ in CPU mode. Download the GGML version instead and use llama.cpp or koboldcpp.
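For reference, a minimal llama.cpp invocation for a GGML model might look like the sketch below. The model filename, prompt, and thread count are placeholders, and the binary name depends on your llama.cpp build:

```shell
# Run a GGML model entirely on the CPU with llama.cpp.
# Model path, prompt, token count, and thread count are examples only.
./main -m ./models/pygmalion-6b.ggml.q4_0.bin -p "Hello" -n 128 -t 8
```

koboldcpp wraps the same CPU inference path behind a web UI, so it's an alternative if you'd rather not use the command line.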
Wait, can I use GPT4xAlpaca on an RX 6600 with oobabooga? It has 8 GB of VRAM, and I'm not sure how much GPT4xAlpaca needs.
Yeah, but you need to set up ROCm in Linux or WSL (not sure how well it works there). And I think you may still have to use pre-layer.
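If pre-layer is needed, it's set when launching the web UI. A sketch, assuming the `--pre_layer` flag works the way I understand it (the layer count and model name here are examples, not recommendations):

```shell
# Offload only the first 20 transformer layers to the GPU;
# the remaining layers run on the CPU, reducing VRAM usage.
python server.py --model gpt4-x-alpaca --pre_layer 20
```

Lowering the layer count trades speed for VRAM, so you may need to experiment to fit within 8 GB.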
This issue has been closed due to inactivity for 30 days. If you believe it is still relevant, please leave a comment below.
Describe the bug
I have installed oobabooga in CPU mode, but when I try to launch Pygmalion it says "CUDA out of memory".
Is there an existing issue for this?
Reproduction
Run Pygmalion with oobabooga in CPU mode.
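For context, launching the web UI in CPU mode is usually done with the `--cpu` flag; something like the following (the model name is an example):

```shell
# Force CPU inference; with this flag no CUDA allocations should occur.
python server.py --model pygmalion-6b --cpu
```

If a "CUDA out of memory" error still appears with `--cpu` set, the model may be loading through a GPU-only code path (e.g. a GPTQ quantization), which matches the advice above to use a GGML build instead.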
Screenshot
No response
Logs