THUDM / ChatGLM-6B

ChatGLM-6B: An Open Bilingual Dialogue Language Model | 开源双语对话语言模型
Apache License 2.0

[BUG/Help] chatglm-6b-int4: after P-Tuning, output is empty when switching inference from chat to generate #1452

Open dzhengxin opened 5 months ago

dzhengxin commented 5 months ago

Is there an existing issue for this?

Current Behavior

chat returns a normal response, but with generate the printed token ids contain only one token (5) beyond the input, which decodes to an empty string. Single-query inference via chat works fine; after switching to generate, both single-query and batch inference return empty results.
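For comparison (not part of the original report): chat() in the published modeling_chatglm.py is essentially a thin wrapper around generate() that tokenizes a single unpadded prompt, samples by default, and decodes only the newly generated tokens. The sketch below is a rough reconstruction from memory of that code path, may differ between checkpoint revisions, and is shown only to highlight how it differs from the failing call.

```python
# Rough reconstruction of the model.chat() path in modeling_chatglm.py
# (details may differ across checkpoint revisions); shown only for comparison.
def chat_like(model, tokenizer, query, history=None, max_length=2048):
    history = history or []
    if not history:
        prompt = query
    else:
        # Multi-turn queries are wrapped in the "[Round N]\n问：...\n答：..." template.
        prompt = ""
        for i, (old_query, old_response) in enumerate(history):
            prompt += "[Round {}]\n问：{}\n答：{}\n".format(i, old_query, old_response)
        prompt += "[Round {}]\n问：{}".format(len(history), query)
    # Single prompt, no padding; chat() samples by default (do_sample=True, top_p=0.7, temperature=0.95).
    inputs = tokenizer([prompt], return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_length=max_length, do_sample=True, top_p=0.7, temperature=0.95)
    new_tokens = outputs.tolist()[0][len(inputs["input_ids"][0]):]
    return tokenizer.decode(new_tokens)
```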

```python
inputs = self._tokenizer(text_list, padding=True, return_tensors="pt")
inputs = inputs.to(self._model.device)
outputs = self._model.generate(**inputs, max_length=512, do_sample=False)
```

Both decoding approaches return empty strings:

```python
llm_outputs = list()
for j, output in enumerate(outputs.tolist()):
    index = len(inputs["input_ids"][j])
    output1 = output[index:]
    response = self._tokenizer.decode(output1, skip_special_tokens=True)
    llm_outputs.append(response)
```

```python
llm_outputs2 = self._tokenizer.batch_decode(outputs)
```
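To narrow down where the two paths diverge, a minimal diagnostic like the following can help (a sketch only: `model` and `tokenizer` stand for the already-loaded P-Tuned chatglm-6b-int4 model and its tokenizer, and the query strings are placeholders, not from the report). It decodes the generated tokens with special tokens kept, so a lone EOS or padding token shows up instead of silently decoding to an empty string, and it compares an unpadded single-prompt call with the padded batch call from the report:

```python
import torch

query = "你好"  # placeholder query

# Unpadded single-prompt generate, mirroring the tokenization path chat() uses.
single = tokenizer([query], return_tensors="pt").to(model.device)
with torch.no_grad():
    out_single = model.generate(**single, max_length=512, do_sample=False)

# Padded batch generate, matching the failing call from the report.
batch = tokenizer([query, query + query], padding=True, return_tensors="pt").to(model.device)
with torch.no_grad():
    out_batch = model.generate(**batch, max_length=512, do_sample=False)

for name, outputs, inputs in (("single", out_single, single), ("batch", out_batch, batch)):
    for j, output in enumerate(outputs.tolist()):
        new_tokens = output[len(inputs["input_ids"][j]):]
        # Keep special tokens so a lone EOS/padding token is visible instead of decoding to "".
        print(name, j, new_tokens, repr(tokenizer.decode(new_tokens, skip_special_tokens=False)))
```

If the unpadded call produces real text while only the padded call yields the single trailing token, the padded-batch path (padding side / attention-mask handling) is a more likely culprit than the P-Tuning weights; if both are empty with do_sample=False, the difference from chat() is more likely its sampling defaults or prompt format.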