InternLM / InternLM-XComposer

InternLM-XComposer-2.5: A Versatile Large Vision Language Model Supporting Long-Contextual Input and Output

The code in https://huggingface.co/internlm/internlm-xcomposer2-7b-4bit cannot be run successfully. #184

Closed: zhulinJulia24 closed this issue 4 months ago

zhulinJulia24 commented 4 months ago

https://huggingface.co/internlm/internlm-xcomposer2-7b-4bit

Running the code in the README there, I hit the following issues (a sketch of the affected pattern follows the list):

  1. `quant_model` is never initialized before it is used.
  2. `auto_gptq.modeling` does not export `BaseGPTQForCausalLM`. The installed auto_gptq version is 0.7.0.
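
For reference, here is a minimal sketch of what the loading step has to look like for the later `quant_model.chat(...)` calls to resolve. The class wiring and module names below follow my reading of the 4-bit model card and should be treated as assumptions, not the authoritative snippet:

```python
import torch
import auto_gptq
from transformers import AutoTokenizer
from auto_gptq.modeling import BaseGPTQForCausalLM  # fails on auto_gptq 0.7.0, see below

auto_gptq.modeling._base.SUPPORTED_MODELS = ["internlm"]
torch.set_grad_enabled(False)

# Adapter mapping the GPTQ machinery onto the InternLM-XComposer2 layer layout
# (names assumed from the model card).
class InternLMXComposer2QForCausalLM(BaseGPTQForCausalLM):
    layers_block_name = "model.layers"
    outside_layer_modules = ["vit", "vision_proj", "model.tok_embeddings", "model.norm", "output"]
    inside_layer_modules = [
        ["attention.wqkv.linear"],
        ["attention.wo.linear"],
        ["feed_forward.w1.linear", "feed_forward.w3.linear"],
        ["feed_forward.w2.linear"],
    ]

# Issue 1: the loaded model must be bound to the name `quant_model`, since that
# is what the README's follow-up snippets reference.
quant_model = InternLMXComposer2QForCausalLM.from_quantized(
    "internlm/internlm-xcomposer2-7b-4bit", trust_remote_code=True, device="cuda:0"
).eval()
tokenizer = AutoTokenizer.from_pretrained(
    "internlm/internlm-xcomposer2-7b-4bit", trust_remote_code=True
)
```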

The `auto_gptq/modeling/__init__.py` in that environment confirms issue 2:

```
$ cat /home/zhulin1/miniconda3/envs/lmdeployv25/lib/python3.10/site-packages/auto_gptq/modeling/__init__.py
from ._base import BaseQuantizeConfig
from .auto import GPTQ_CAUSAL_LM_MODEL_MAP, AutoGPTQForCausalLM
from .baichuan import BaiChuanGPTQForCausalLM
from .bloom import BloomGPTQForCausalLM
from .codegen import CodeGenGPTQForCausalLM
from .decilm import DeciLMGPTQForCausalLM
from .gpt2 import GPT2GPTQForCausalLM
from .gpt_bigcode import GPTBigCodeGPTQForCausalLM
from .gpt_neox import GPTNeoXGPTQForCausalLM
from .gptj import GPTJGPTQForCausalLM
from .internlm import InternLMGPTQForCausalLM
from .llama import LlamaGPTQForCausalLM
from .longllama import LongLlamaGPTQForCausalLM
from .mistral import MistralGPTQForCausalLM
from .mixtral import MixtralGPTQForCausalLM
from .moss import MOSSGPTQForCausalLM
from .opt import OPTGPTQForCausalLM
from .qwen import QwenGPTQForCausalLM
from .qwen2 import Qwen2GPTQForCausalLM
from .rw import RWGPTQForCausalLM
from .stablelmepoch import StableLMEpochGPTQForCausalLM
from .xverse import XverseGPTQForCausalLM
from .yi import YiGPTQForCausalLM
```

Note that `BaseGPTQForCausalLM` is absent from the exports.
wanghanyang123 commented 4 months ago

Same development environment as yours; I can run it, but it is slow, around 20 seconds per query.

zhulinJulia24 commented 4 months ago

> Same development environment as yours; I can run it, but it is slow, around 20 seconds per query.

@wanghanyang123 I can reproduce this consistently. From what I can tell, the auto_gptq problem is a bug on their side: https://github.com/AutoGPTQ/AutoGPTQ/issues/552. My auto_gptq is the latest version, 0.7.0.
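
One possible workaround until that is resolved upstream is an import shim. Note this assumes `BaseGPTQForCausalLM` still exists in the private `auto_gptq.modeling._base` module on 0.7.0, which I have not verified; if the fallback import also fails, pinning `auto_gptq==0.6.0` is the safer route:

```python
try:
    # auto_gptq <= 0.6.0 re-exports the class from the package namespace
    from auto_gptq.modeling import BaseGPTQForCausalLM
except ImportError:
    # auto_gptq 0.7.0 dropped the re-export; the class may still live in the
    # private _base module (assumption; otherwise pin auto_gptq==0.6.0)
    from auto_gptq.modeling._base import BaseGPTQForCausalLM
```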

zhulinJulia24 commented 4 months ago

Downgrading auto_gptq to 0.6.0 resolved the BaseGPTQForCausalLM error, but the "quant_model is not initialized" problem in the README presumably still remains?

zhulinJulia24 commented 4 months ago

> Downgrading auto_gptq to 0.6.0 resolved the BaseGPTQForCausalLM error, but the "quant_model is not initialized" problem in the README presumably still remains?

That has already been fixed in a PR.

wanghanyang123 commented 4 months ago

So what is your inference time per query?

zhulinJulia24 commented 4 months ago

> So what is your inference time per query?

I haven't measured it. I did notice that different transformers versions take different amounts of time; try updating to the latest version.
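
A rough way to compare per-query latency across transformers versions (a sketch only; the `<ImageHere>` query format, the example image path, and the `chat` signature are assumed from the model card, not taken from this thread):

```python
import time
import torch

query = "<ImageHere>Please describe this image in detail."
image = "examples/image.jpg"  # hypothetical example path

torch.cuda.synchronize()  # make sure pending GPU work doesn't skew the timer
start = time.perf_counter()
with torch.cuda.amp.autocast():
    response, _ = quant_model.chat(
        tokenizer, query=query, image=image, history=[], do_sample=False
    )
torch.cuda.synchronize()
print(f"per-query latency: {time.perf_counter() - start:.1f}s")
```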