THUDM / GLM-4

GLM-4 series: Open Multilingual Multimodal Chat LMs | 开源多语言多模态对话模型
Apache License 2.0

BUG: The latest glm4v code forces `flash_attn` to be installed when loading with the transformers `AutoModelForCausalLM` class #239

Closed ChengjieLi28 closed 5 days ago

ChengjieLi28 commented 6 days ago

System Info / 系統信息

Python 3.10 Cuda 12.1

Who can help? / 谁可以帮助到您?

@zRzRzRzRzRzRzR

Information / 问题信息

Reproduction / 复现过程

import torch
from transformers import AutoModelForCausalLM

MODEL_PATH = '<path>'
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH,
    low_cpu_mem_usage=True,
    trust_remote_code=True,
    torch_dtype=torch.float16,
    device_map="auto"
)

The error comes out of transformers; the stack trace looks roughly like this:

File "/home/lichengjie/workspace/inference/xinference/model/llm/pytorch/glm4v.py", line 87, in load
    model = AutoModelForCausalLM.from_pretrained(
  File "/home/lichengjie/miniconda3/envs/xinf/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 550, in from_pretrained
    model_class = get_class_from_dynamic_module(
  File "/home/lichengjie/miniconda3/envs/xinf/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 501, in get_class_from_dynamic_module
    final_module = get_cached_module_file(
  File "/home/lichengjie/miniconda3/envs/xinf/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 326, in get_cached_module_file
    modules_needed = check_imports(resolved_module_file)
  File "/home/lichengjie/miniconda3/envs/xinf/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 181, in check_imports
    raise ImportError(
ImportError: [address=0.0.0.0:33683, pid=1938627] This modeling file requires the following packages that were not found in your environment: flash_attn. Run `pip install flash_attn`

Expected behavior / 期待表现

When the flash attention library is not installed, fall back to the original logic. This library cannot be used on Windows.
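The requested fallback could look something like the sketch below: probe for the `flash_attn` package before selecting an attention implementation, instead of importing it unconditionally in the modeling file. This is an illustrative sketch of the idea, not the actual `modeling_chatglm.py` fix; the function name `pick_attn_implementation` is hypothetical.

```python
# Sketch of the expected fallback behavior (illustrative only, not the
# repository's actual fix): use flash attention only when the optional
# flash_attn package is installed, otherwise fall back to PyTorch's
# built-in scaled_dot_product_attention ("sdpa").
import importlib.util


def pick_attn_implementation() -> str:
    # find_spec checks availability without importing, so a missing
    # flash_attn never raises ImportError on Windows.
    if importlib.util.find_spec("flash_attn") is not None:
        return "flash_attention_2"
    return "sdpa"


print(pick_attn_implementation())
```

With a guard like this, environments without `flash_attn` (e.g. Windows) would load the model on the default attention path rather than failing at import time.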

zRzRzRzRzRzRzR commented 6 days ago
(screenshot of the suggested code change)

Change it to this; I will push an update later.

melonxi commented 6 days ago

Hi, commenting out lines 31-33 of modeling_chatglm.py and switching the device to cpu when running the code makes it work.

zRzRzRzRzRzRzR commented 5 days ago

Fixed; a PR has been submitted. Please update to the latest version on Hugging Face.