jshilong / GPT4RoI

GPT4RoI: Instruction Tuning Large Language Model on Region-of-Interest

Pre-trained weights #34

Closed gavin-gqzhang closed 11 months ago

gavin-gqzhang commented 11 months ago

Hello, the Hugging Face download link for the pre-trained weights that you posted is no longer available. Could you update it, or provide another download channel? Thank you very much!

gavin-gqzhang commented 11 months ago

Hello, I downloaded LLaMA from a different channel, and a tokenizer loading error occurred while merging the delta weights. Part of the error output is as follows:

File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 257, in _convert_token_to_id_with_added_voc
    return self.unk_token_id
File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1155, in unk_token_id
    return self.convert_tokens_to_ids(self.unk_token)
File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 250, in convert_tokens_to_ids
    return self._convert_token_to_id_with_added_voc(tokens)
[the three frames above repeat]
File "/opt/anaconda3/envs/sgg_llava/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1035, in unk_token
    return str(self._unk_token)
RecursionError: maximum recursion depth exceeded while getting the str of an object

The error is caused by the tokenizer loaded from the delta model: tokenizers I loaded from other models did not show this problem, so it is probably a tokenizer issue in this repository.
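For what it's worth, the shape of the traceback suggests a mutual recursion inside the tokenizer: an unknown token falls back to `unk_token_id`, which looks up `unk_token` via `convert_tokens_to_ids`, which falls back again. This is a minimal self-contained sketch of that loop, not the actual transformers code; the class, vocab, and token names are made up for illustration, and the assumption is that the delta checkpoint's tokenizer config names an `unk_token` that is missing from its vocabulary.

```python
# Hypothetical sketch of the recursion pattern behind the RecursionError.
# Assumption: the tokenizer's configured unk_token is absent from its vocab.
class TokenizerSketch:
    def __init__(self, vocab, unk_token):
        self.vocab = vocab          # token -> id mapping
        self.unk_token = unk_token  # fallback token name from the config

    def convert_tokens_to_ids(self, token):
        # Unknown tokens fall back to the unk token's id ...
        if token in self.vocab:
            return self.vocab[token]
        return self.unk_token_id

    @property
    def unk_token_id(self):
        # ... which is resolved by the same lookup, so if unk_token itself
        # is not in the vocab the two methods call each other forever.
        return self.convert_tokens_to_ids(self.unk_token)


tok = TokenizerSketch({"hello": 0}, unk_token="<unk>")
print(tok.convert_tokens_to_ids("hello"))  # known token resolves normally
try:
    tok.convert_tokens_to_ids("world")     # unknown token triggers the loop
except RecursionError as e:
    print("RecursionError:", e)
```

If this is indeed what is happening, checking whether the delta tokenizer's `unk_token` actually appears in its vocabulary (or re-downloading the tokenizer files) would be the first thing to try.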

hxx-who commented 5 months ago

I ran into the same problem... How did you solve the "maximum recursion depth exceeded while getting the str of an object" error?