Closed · sofia-lrf closed this 1 month ago
/root/ChatGLM-6B/web_demo.py:44: SyntaxWarning: invalid escape sequence '\`'
  line = line.replace("`", "\`")
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
Loading model from: /root/chatglm3-6b
Traceback (most recent call last):
  File "/root/ChatGLM-6B/web_demo.py", line 7, in <module>
Also, after pulling the model down to a local directory, loading it raised the error above; it seems local paths are not supported.
Resolved: the downloaded model files were the problem.
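Since the root cause was a bad download, one way to catch this early is to hash each local weight shard and compare the digest by hand against the `oid sha256:` value shown on the corresponding file page of the Hugging Face repo. A minimal sketch using only the standard library; the checkpoint path and shard naming pattern are assumptions based on the `/root/chatglm3-6b` layout mentioned above:

```python
import hashlib
from pathlib import Path

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in 1 MiB chunks and return its hex SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def checksum_report(model_dir):
    """Map each weight shard under model_dir to its SHA-256, for manual comparison."""
    model_dir = Path(model_dir)
    shards = sorted(model_dir.glob("pytorch_model*.bin")) + \
             sorted(model_dir.glob("*.safetensors"))
    return {p.name: sha256_of(p) for p in shards}
```

Any shard whose digest does not match the value published in the repo should be deleted and re-downloaded.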
Is there an existing issue for this?
Current Behavior
Following the official documentation, I deployed and ran the demo, and calling the model raised an error.
Expected Behavior
No response
Steps To Reproduce
1. Following an earlier issue, reinstalled the library with pip install transformers==4.33.0
2. Running response, history = model.chat(tokenizer, "你好", history=[]) raised the following error:

AttributeError                            Traceback (most recent call last)
in <cell line: 1>()
----> 1 response, history = model.chat(tokenizer, "你好", history=[])

1 frames
~/.cache/huggingface/modules/transformers_modules/THUDM/chatglm3-6b/6f3b58ec10f088978ae174398f9d20b6dfc71552/modeling_chatglm.py in chat(self, tokenizer, query, history, role, max_length, num_beams, do_sample, top_p, temperature, logits_processor, **kwargs)
   1036         gen_kwargs = {"max_length": max_length, "num_beams": num_beams, "do_sample": do_sample, "top_p": top_p,
   1037                       "temperature": temperature, "logits_processor": logits_processor, **kwargs}
-> 1038         inputs = tokenizer.build_chat_input(query, history=history, role=role)
   1039         inputs = inputs.to(self.device)
   1040         eos_token_id = [tokenizer.eos_token_id, tokenizer.get_command("<|user|>"),

AttributeError: 'ChatGLMTokenizer' object has no attribute 'build_chat_input'
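A missing build_chat_input usually means the ChatGLMTokenizer class was loaded from somewhere other than the ChatGLM3 checkpoint, e.g. an incomplete download whose tokenization_chatglm.py is missing or stale, since that method ships with the model's remote code rather than with transformers itself. A quick local sanity check, sketched under the assumption that the checkpoint follows the file layout of the THUDM/chatglm3-6b repo (the REQUIRED list below is an assumption, not an official manifest):

```python
from pathlib import Path

# Files a usable ChatGLM3 checkpoint directory is assumed to contain.
REQUIRED = ["tokenizer_config.json", "tokenizer.model", "tokenization_chatglm.py"]

def tokenizer_sanity(model_dir):
    """Return (missing_files, has_build_chat_input) for a local checkpoint."""
    model_dir = Path(model_dir)
    missing = [name for name in REQUIRED if not (model_dir / name).exists()]
    tok_py = model_dir / "tokenization_chatglm.py"
    # The ChatGLM3 tokenizer code defines build_chat_input; older ChatGLM/ChatGLM2
    # tokenizer code does not, which reproduces exactly this AttributeError.
    has_chat = tok_py.exists() and "def build_chat_input" in tok_py.read_text(encoding="utf-8")
    return missing, has_chat
```

If has_chat comes back False, re-download the checkpoint; it can also help to clear ~/.cache/huggingface/modules/transformers_modules so a stale cached tokenizer module is not reused.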
Environment
Anything else?
No response