AragornHorse opened 2 months ago
Hi @AragornHorse, I will need more details about your environment and hardware to determine what the issue is.
Hi, I have the same question when running Qwen2-7B-AWQ.
Hardware: Ubuntu 22.04, V100 GPU
Environment: autoawq==0.2.6, torch==2.3.1
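Since the maintainer asked for environment and hardware details, a small script like the following can collect them in one place (a minimal sketch; the package list and `collect_env_report` helper name are my own, and the GPU fields only appear when `torch` with CUDA is installed):

```python
import importlib.metadata
import platform

def collect_env_report():
    """Gather the version info maintainers usually ask for in issue reports."""
    report = {"python": platform.python_version(), "os": platform.platform()}
    # Package set is an assumption; extend it with whatever your issue involves.
    for pkg in ("autoawq", "torch", "transformers"):
        try:
            report[pkg] = importlib.metadata.version(pkg)
        except importlib.metadata.PackageNotFoundError:
            report[pkg] = "not installed"
    # GPU details require torch; guard the import so the script runs without it.
    try:
        import torch
        if torch.cuda.is_available():
            report["gpu"] = torch.cuda.get_device_name(0)
            major, minor = torch.cuda.get_device_capability(0)
            report["compute_capability"] = f"{major}.{minor}"
    except ImportError:
        pass
    return report

if __name__ == "__main__":
    for key, value in collect_env_report().items():
        print(f"{key}: {value}")
```

Pasting its output into the issue (V100s report compute capability 7.0, which some quantized kernels do not support) helps narrow down whether the problem is hardware- or version-specific.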
I ran into the same problem when running codellama-7b-AWQ.
Hardware: V100
Environment: awq==0.2.6, torch==2.4.0
I encountered the following error while using the quantized Qwen-72B.
How can I solve it?