Hello, I changed the download channel for LLaMA, and then a tokenizer loading error occurred when merging the delta weights. Part of the error output is as follows:
File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 257, in _convert_token_to_id_with_added_voc return self.unk_token_id File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1155, in unk_token_id return self.convert_tokens_to_ids(self.unk_token) File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 250, in convert_tokens_to_ids return self._convert_token_to_id_with_added_voc(tokens) File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 257, in _convert_token_to_id_with_added_voc return self.unk_token_id File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1155, in unk_token_id return self.convert_tokens_to_ids(self.unk_token) File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 250, in convert_tokens_to_ids return self._convert_token_to_id_with_added_voc(tokens) File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 257, in _convert_token_to_id_with_added_voc return self.unk_token_id File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1155, in unk_token_id return self.convert_tokens_to_ids(self.unk_token) File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 250, in convert_tokens_to_ids return self._convert_token_to_id_with_added_voc(tokens) File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 257, in _convert_token_to_id_with_added_voc return self.unk_token_id File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1155, in unk_token_id return self.convert_tokens_to_ids(self.unk_token) File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1035, in unk_token return str(self._unk_token) RecursionError: maximum recursion depth exceeded while getting the str of an object
The error is raised while loading the delta model's tokenizer. Tokenizers from other models load without this problem, so it is probably caused by a tokenizer issue in this repository.
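For reference, here is a minimal sketch of a workaround that is sometimes suggested for this kind of recursion (the traceback shows unk_token_id repeatedly calling convert_tokens_to_ids on the unk_token): load the slow SentencePiece tokenizer instead of the fast one, or point at a freshly downloaded base LLaMA checkpoint. The path below is a placeholder, not an official checkpoint name, and whether this resolves the issue depends on the tokenizer files you actually have:

```python
from transformers import AutoTokenizer

# Hypothetical local path to the base LLaMA weights (adjust to your setup).
base_model_path = "/path/to/llama-base"

# use_fast=False loads the slow tokenizer, bypassing the fast tokenizer's
# unk_token_id lookup that recurses in the traceback above.
tokenizer = AutoTokenizer.from_pretrained(base_model_path, use_fast=False)

# Sanity check: if unk_token is missing from the vocabulary, the fast
# tokenizer's unk_token_id lookup can recurse indefinitely.
print(tokenizer.unk_token, tokenizer.unk_token_id)
```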
I ran into the same problem... How did you solve the "maximum recursion depth exceeded while getting the str of an object" error?
Hello, the Hugging Face download link for the pre-trained weights is no longer available. Could you update it, or provide another download channel? Thank you very much!