ParisNeo opened this issue 11 months ago
I got that when torch wasn't CUDA-enabled.
Check whether running

import torch
torch.cuda.is_available()

returns True in your Python environment. If not, try installing PyTorch from https://pytorch.org/get-started/locally/ .
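For reference, a minimal sketch of that check (the pip command in the comment is an assumption; pick the exact command for your CUDA version from the pytorch.org selector):

```python
# Quick sanity check that torch is a CUDA-enabled build.
# If it prints False, reinstall a CUDA wheel, e.g. (assumed cu121 index;
# confirm the exact command on https://pytorch.org/get-started/locally/):
#   pip install torch --index-url https://download.pytorch.org/whl/cu121
import torch

print("torch version:", torch.__version__)
print("built against CUDA:", torch.version.cuda)  # None on CPU-only builds
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```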
Wow, thank you. I had been searching and reinstalling oobabooga over and over trying to fix the same issue. Following your tip and installing PyTorch in the conda terminal solved it for me. (Win 11)
Sorry I forgot this issue. It was fixed long ago. Now LoLLMs supports AWQ models without any problem. Thanks.
Recently I ran into a similar situation, and torch.cuda.is_available() returns True in my Anaconda environment. Could you please tell me how you fixed it? Thanks.
Hi. Sorry, I didn't see your message. I just upgraded everything to the new CUDA 12.1 and torch 2.1 and reinstalled transformers. Since then everything works fine.
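For anyone who wants to confirm the upgraded stack lines up, a small check along these lines (the 2.1 / 12.1 strings below are just the versions from my setup, not requirements):

```python
# Print the versions of the upgraded stack described above.
# Expected values (assumptions from my setup): torch 2.1.x built
# against CUDA 12.1, plus a freshly reinstalled transformers.
import torch
import transformers

print("torch:", torch.__version__)
print("CUDA build:", torch.version.cuda)   # should print "12.1" here
print("transformers:", transformers.__version__)
assert torch.cuda.is_available(), "CUDA still not visible to torch"
```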
Hi. I am building a binding for AWQ in lollms and I get this problem after I install it:
ImportError: DLL load failed while importing awq_inference_engine

The error seems to come from awq\modules\linear.py, line 4:

import awq_inference_engine  # with CUDA kernels
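On Windows this kind of DLL load failure usually means the compiled kernels and the installed torch wheel were built against different CUDA runtimes. A hedged diagnostic sketch (awq_inference_engine is the extension module named in the traceback; the rest is plain torch):

```python
# Narrow down the DLL load failure: compare the CUDA version torch was
# built with against what the AWQ kernels expect.
import torch

print("torch:", torch.__version__, "| CUDA build:", torch.version.cuda)
print("CUDA available:", torch.cuda.is_available())

try:
    import awq_inference_engine  # the compiled AWQ CUDA kernels
    print("awq_inference_engine imported OK")
except ImportError as err:
    # Typical causes: a CPU-only torch wheel, or kernels compiled
    # against a different CUDA toolkit than this torch build.
    print("AWQ kernel import failed:", err)
```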