WisdomShell / codeshell

A series of code large language models developed by PKU-KCL
http://se.pku.edu.cn/kcl

unsupported operand type(s) for -: 'int' and 'NoneType' #22

Closed. JudeZzz1997 closed this issue 11 months ago.

JudeZzz1997 commented 11 months ago

TypeError                                 Traceback (most recent call last)
Cell In[20], line 3
      1 history = []
      2 query = '你是谁?'
----> 3 response = model.chat(query, history, tokenizer)
      4 print(response)
      5 history.append((query, response))

File ~/.cache\huggingface\modules\transformers_modules\CodeShell-7B-Chat\modeling_codeshell.py:971, in CodeShellForCausalLM.chat(self, query, history, tokenizer, stream, generation_config)
    968 def chat(self, query, history, tokenizer, stream=False,
    969          generation_config: Optional[GenerationConfig]=None):
    970     generation_config = generation_config or self.generation_config
--> 971     input_ids = self.build_chat_input(query, history, tokenizer, generation_config.max_new_tokens)
    972     stopping_criteria = StoppingCriteriaList(
    973         [EndOfFunctionCriteria([len(input_ids[0])], ['||', '<|endoftext|>'], tokenizer)]
    974     )
    976     if stream:

File ~/.cache\huggingface\modules\transformers_modules\CodeShell-7B-Chat\modeling_codeshell.py:962, in CodeShellForCausalLM.build_chat_input(self, query, history, tokenizer, max_new_tokens)
    959     prompt += ai_name.rstrip()
    961     max_new_tokens = max_new_tokens or self.generation_config.max_new_tokens
--> 962     max_input_tokens = self.config.n_positions - max_new_tokens
    964     input_tokens = tokenizer.encode(prompt)
    965     input_tokens = input_tokens[-max_input_tokens:]  # truncate left
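The traceback shows that `max_new_tokens` is still `None` at line 962: neither the call site nor the `or self.generation_config.max_new_tokens` fallback supplied an integer, so `int - None` raises `TypeError`. A minimal sketch of the failing computation and a defensive guard, assuming the attribute names from the traceback (`n_positions`, `max_new_tokens`); `DEFAULT_MAX_NEW_TOKENS` is a hypothetical fallback value, not from the repository:

```python
# Hypothetical fallback; the real default lives in the model's generation_config.
DEFAULT_MAX_NEW_TOKENS = 512

def max_input_tokens(n_positions, max_new_tokens):
    """Compute the input-token budget, mirroring line 962 of the traceback."""
    if max_new_tokens is None:
        # Without this guard, n_positions - None raises:
        # TypeError: unsupported operand type(s) for -: 'int' and 'NoneType'
        max_new_tokens = DEFAULT_MAX_NEW_TOKENS
    return n_positions - max_new_tokens
```

With the guard, an unset `max_new_tokens` falls back to the default instead of propagating `None` into the subtraction.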

ruixie commented 11 months ago

Fixed. Please update the model code to the latest version.

https://huggingface.co/WisdomShell/CodeShell-7B-Chat/blob/main/modeling_codeshell.py