tangent2018 opened this issue 3 weeks ago
Hello, which version of autoawq are you using?
I use the model from https://www.modelscope.cn/linglingdan/MiniCPM-V_2_6_awq_int4.git. I thought AutoAWQ was not necessary to run vLLM?
I think you should try installing my fork of AutoAWQ:
git clone https://github.com/LDLINGLINGLING/AutoAWQ.git
cd AutoAWQ
pip install -e .
Same issue. The environment is:
torch==2.4.0
torchvision==0.19.0
autoawq==0.2.6+cu121
autoawq_kernels==0.0.6
cuda==12.1
Does anyone know how to fix it?
I tried to install AutoAWQ, but AutoAWQ needs torch==2.3.1 while vLLM (0.5.4) uses torch==2.4.0. Perhaps I should build an environment from scratch rather than using the Docker image. If you plan to build a Docker image for MiniCPM-V_2_6_awq_int4 with vLLM, please let me know.
OK, I've been very busy recently. I'll give it a try when I have time.
Load model weight error when running MiniCPM-V_2_6_awq_int4
vLLM environment: Docker image vllm/vllm-openai:v0.5.4
model download from: git clone https://www.modelscope.cn/linglingdan/MiniCPM-V_2_6_awq_int4.git
Run code:
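The original run command is not shown here. As a sketch only, a typical invocation for this setup (assuming the model repo was cloned into the current directory and the container's OpenAI-compatible server is used; the mount path and port are illustrative) might look like:

```shell
# Hypothetical serving command for the vllm/vllm-openai:v0.5.4 image.
# Assumes the model was cloned to ./MiniCPM-V_2_6_awq_int4 and a GPU is available.
docker run --gpus all \
    -v "$(pwd)/MiniCPM-V_2_6_awq_int4:/model" \
    -p 8000:8000 \
    vllm/vllm-openai:v0.5.4 \
    --model /model \
    --quantization awq \
    --trust-remote-code
```

`--quantization awq` tells vLLM to load the AWQ int4 weights, and `--trust-remote-code` is needed because MiniCPM-V ships custom modeling code.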
Traceback: