InternLM / xtuner

An efficient, flexible and full-featured toolkit for fine-tuning LLMs (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
https://xtuner.readthedocs.io/zh-cn/latest/
Apache License 2.0

ChatGLM3-6b: AttributeError: can't set attribute when testing the model #636

Open padsasdasd opened 2 months ago

padsasdasd commented 2 months ago

xtuner chat /root/autodl-tmp/add --prompt-template default

Traceback (most recent call last):
  File "/root/ChatGLM3/xtuner/xtuner/tools/chat.py", line 491, in <module>
    main()
  File "/root/ChatGLM3/xtuner/xtuner/tools/chat.py", line 237, in main
    tokenizer = AutoTokenizer.from_pretrained(
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 774, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 2028, in from_pretrained
    return cls._from_pretrained(
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 2260, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/root/.cache/huggingface/modules/transformers_modules/add/tokenization_chatglm.py", line 108, in __init__
    super().__init__(padding_side=padding_side, clean_up_tokenization_spaces=clean_up_tokenization_spaces,
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils.py", line 363, in __init__
    super().__init__(**kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1602, in __init__
    super().__init__(**kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 861, in __init__
    setattr(self, key, value)
AttributeError: can't set attribute
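For context, the last frame of the traceback fails on setattr(self, key, value) inside the base tokenizer's __init__. ChatGLM3's custom tokenization_chatglm.py exposes some special-token fields as read-only properties, so when the tokenizer_config.json saved alongside the merged model contains those keys and transformers tries to assign them back while loading, Python raises "AttributeError: can't set attribute". The following is a minimal, hypothetical sketch of that failure mode (not the actual xtuner, transformers, or ChatGLM3 source):

# Minimal illustration only: assigning to a read-only @property raises
# "AttributeError: can't set attribute" on Python 3.8, which is the same
# pattern the traceback above ends in.

class BaseTokenizer:
    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            # the transformers base class assigns leftover config kwargs roughly like this
            setattr(self, key, value)

class ChatGLMLikeTokenizer(BaseTokenizer):
    @property
    def eos_token(self):
        # read-only property: no setter is defined
        return "</s>"

try:
    # a config entry colliding with the read-only property triggers the error
    ChatGLMLikeTokenizer(eos_token="</s>")
except AttributeError as e:
    print(e)  # -> can't set attribute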

The model merge completed successfully, but this error keeps appearing when I test the model. Any help would be appreciated.

LZHgrla commented 2 months ago

@padsasdasd

This is a known issue with ChatGLM3-series models; a similar report is https://github.com/InternLM/xtuner/issues/221

As a workaround, you can replace the tokenizer files in the merged model with the tokenizer files from the original (pre-training) model.
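A rough sketch of that workaround is below, assuming the original ChatGLM3-6b checkpoint is still available locally (the original-model path and the exact file list are illustrative and should be adapted; /root/autodl-tmp/add is the merged model path from the command above):

# Hedged sketch: overwrite the tokenizer files that were saved into the merged
# model with the ones from the original ChatGLM3-6b checkpoint.
import shutil
from pathlib import Path

original = Path("/root/autodl-tmp/chatglm3-6b")  # original base model (assumed path)
merged = Path("/root/autodl-tmp/add")            # merged model tested with `xtuner chat`

for name in ["tokenizer.model", "tokenization_chatglm.py", "tokenizer_config.json"]:
    src = original / name
    if src.exists():
        shutil.copy(src, merged / name)
        print(f"replaced {merged / name}")

After restoring the original tokenizer files, re-run `xtuner chat /root/autodl-tmp/add --prompt-template default` to confirm the tokenizer loads without the AttributeError.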