THUDM / GLM-4

GLM-4 series: Open Multilingual Multimodal Chat LMs
Apache License 2.0

Windows: loading glm-4-9b-chat-fs from ModelScope fails with AttributeError: module 'signal' has no attribute 'SIGALRM' #613

Open xiezhipeng-git opened 3 days ago

xiezhipeng-git commented 3 days ago

System Info / 系統信息

windows

Who can help? / 谁可以帮助到您?

No response

Information / 问题信息

Reproduction / 复现过程

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
# from huggingface_hub import snapshot_download
from modelscope import snapshot_download

# Download the model files
# model_path = snapshot_download('Qwen/Qwen2.5-7B-Instruct')
model_path = snapshot_download('zhw2044154891/glm-4-9b-chat-fs')

# Load the model and tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)

# Select the device
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
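For context on the root cause: `signal.SIGALRM` is only defined on POSIX platforms, and the Windows build of CPython omits it. A quick standard-library check (an illustrative sketch, not part of the original report) confirms this on any machine:

```python
import signal
import sys

# signal.SIGALRM exists only on POSIX; the Windows CPython build omits it,
# which is what makes transformers' interactive prompt raise AttributeError.
if hasattr(signal, "SIGALRM"):
    print(f"{sys.platform}: SIGALRM available")
else:
    print(f"{sys.platform}: SIGALRM missing (expected on Windows)")
```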

Expected behavior / 期待表现

No error; the model should load and work normally. Instead, the following error is raised:

AttributeError                            Traceback (most recent call last)
File d:\my\env\python3.10.10\lib\site-packages\transformers\dynamic_module_utils.py:648, in resolve_trust_remote_code(trust_remote_code, model_name, has_local_code, has_remote_code)
    647 try:
--> 648     prev_sig_handler = signal.signal(signal.SIGALRM, _raise_timeout_error)
    649     signal.alarm(TIME_OUT_REMOTE_CODE)

AttributeError: module 'signal' has no attribute 'SIGALRM'

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
Cell In[2], line 10
      7 model_path = snapshot_download('zhw2044154891/glm-4-9b-chat-fs')
      9 # Load the model and tokenizer
---> 10 tokenizer = AutoTokenizer.from_pretrained(model_path)
     11 model = AutoModelForCausalLM.from_pretrained(model_path)
     13 # Select the device

File d:\my\env\python3.10.10\lib\site-packages\transformers\models\auto\tokenization_auto.py:879, in AutoTokenizer.from_pretrained(cls, pretrained_model_name_or_path, *inputs, **kwargs)
    871 has_remote_code = tokenizer_auto_map is not None
    872 has_local_code = type(config) in TOKENIZER_MAPPING or (
    873     config_tokenizer_class is not None
    874     and (
   (...)
    877     )
    878 )
--> 879 trust_remote_code = resolve_trust_remote_code(
    880     trust_remote_code, pretrained_model_name_or_path, has_local_code, has_remote_code
    881 )
    883 if has_remote_code and trust_remote_code:
    884     if use_fast and tokenizer_auto_map[1] is not None:

File d:\my\env\python3.10.10\lib\site-packages\transformers\dynamic_module_utils.py:664, in resolve_trust_remote_code(trust_remote_code, model_name, has_local_code, has_remote_code)
    661     signal.alarm(0)
    662 except Exception:
    663     # OS which does not support signal.SIGALRM
--> 664     raise ValueError(
    665         f"The repository for {model_name} contains custom code which must be executed to correctly "
    666         f"load the model. You can inspect the repository content at https://hf.co/{model_name}.\n"
    667         f"Please pass the argument `trust_remote_code=True` to allow custom code to be run."
    668     )
    669 finally:
    670     if prev_sig_handler is not None:

ValueError: The repository for C:\Users\Admin\.cache\modelscope\hub\zhw2044154891\glm-4-9b-chat-fs contains custom code which must be executed to correctly load the model. You can inspect the repository content at https://hf.co/C:\Users\Admin\.cache\modelscope\hub\zhw2044154891\glm-4-9b-chat-fs.
Please pass the argument `trust_remote_code=True` to allow custom code to be run.
zRzRzRzRzRzRzR commented 3 days ago

That `-fs` model isn't one of ours, is it? Also, downgrade transformers to below 4.46; starting with 4.46 it is no longer compatible.
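For anyone following the downgrade suggestion, the installed version can be checked with only the standard library; the `(4, 46)` threshold below comes from the comment above, and the helper name is just for illustration:

```python
from importlib.metadata import PackageNotFoundError, version


def needs_downgrade(threshold=(4, 46)):
    """Return True if the installed transformers version is >= threshold."""
    try:
        v = version("transformers")
    except PackageNotFoundError:
        return False  # transformers is not installed at all
    major, minor = (int(part) for part in v.split(".")[:2])
    return (major, minor) >= threshold


if needs_downgrade():
    print("Consider: pip install 'transformers<4.46'")
```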

xiezhipeng-git commented 2 days ago

@zRzRzRzRzRzRzR What do you mean? Are you saying this model on ModelScope was not uploaded officially by your team? If so, will you upload the model files there yourselves, to make downloads easier for users in mainland China? Also, downgrading doesn't seem reasonable; upgrading would be. We shouldn't have to roll back fixes for other issues just because of one model.