Closed allrobot closed 3 months ago
Had exactly the same error after updating text-generation-webui. Updating CUDA from 11 to 12 (latest) solved this problem for me.
So, I had forgotten to add the CUDA 12 path to the environment variables; after adding it, the model loaded successfully.
Hello allrobot, could you tell me specifically how you solved this issue? I appear to be having it as well, and everything I have tried has proved fruitless. I'm not very proficient in these matters, but everything seems to be up to date, and despite starting from scratch my webui still gives the very error you were having.
Could you walk me through adding the CUDA 12 path to the environment variables? Simply updating to 12 did not solve the issue. Thanks in advance for any assistance you can give.
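Not the original poster, but a quick way to check whether the fix has taken effect is to verify that the CUDA 12 `bin` directory actually appears on PATH in the environment the webui runs in. A minimal sketch, assuming the default CUDA 12.1 install location on Windows (adjust the version suffix to your installation):

```python
# Default CUDA 12.1 install path on Windows -- an assumption, adjust as needed.
CUDA_BIN = r"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.1\bin"

def cuda_on_path(path_var: str, cuda_bin: str = CUDA_BIN) -> bool:
    """Return True if cuda_bin appears as an entry in a Windows-style
    (semicolon-separated) PATH string, compared case-insensitively."""
    entries = [p.strip().rstrip("\\").lower() for p in path_var.split(";")]
    return cuda_bin.rstrip("\\").lower() in entries

if __name__ == "__main__":
    import os
    if not cuda_on_path(os.environ.get("PATH", "")):
        print("CUDA 12 bin directory not found on PATH. Add it under "
              "System Properties > Environment Variables, then open a "
              "fresh terminal before launching the webui.")
```

Note that PATH changes made in the Windows GUI only apply to newly started processes, so the webui must be relaunched from a fresh terminal.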
Describe the bug
The model fails to load in text-generation-webui, even though llama-cpp-python can load the same model without issues. I don't know why the llama.cpp loader in text-generation-webui cannot load it; it reports that llama.dll is not found, even though the DLL actually exists.
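For what it's worth, the "llama.dll not found" message can be misleading: on Windows, the same error is reported when a *dependency* of the DLL (for example a CUDA runtime DLL such as cudart64_12.dll) fails to resolve, even though llama.dll itself exists on disk. A minimal sketch for probing this with ctypes (the DLL path is whatever your install uses):

```python
import ctypes

def try_load(dll_path: str) -> str:
    """Attempt to load a shared library and report the result.

    On Windows, a "module could not be found" error here can mean that a
    dependency of the DLL (e.g. a CUDA runtime DLL from CUDA 12) is
    missing from PATH, not that dll_path itself is missing.
    """
    try:
        ctypes.CDLL(dll_path)
        return "ok"
    except OSError as exc:
        return f"load failed: {exc}"
```

Calling `try_load` on the llama.dll inside your webui environment distinguishes "file missing" from "file present but dependencies unresolved", which is consistent with the CUDA-12-PATH fix described above.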
Is there an existing issue for this?
Reproduction
Download the TheBloke/CausalLM-14B-GGUF model, open http://127.0.0.1:7860/, switch to the Model tab, and select the llama.cpp loader. The model then throws an error on loading.
Screenshot
Logs
System Info