llll111111 opened this issue 6 days ago (Open)
We suspect there may be version mismatches between the libraries in your environment and our requirements. To verify, please cross-reference your installed library versions against those listed in https://github.com/OpenGVLab/InternVL/blob/main/requirements/internvl_chat.txt
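One way to automate that cross-check is to compare each pinned version against what is actually installed. The sketch below is illustrative: the pins shown are placeholders, and the authoritative versions are the ones in `requirements/internvl_chat.txt` from the repo.

```python
from importlib.metadata import version, PackageNotFoundError

# Placeholder pins for illustration only -- substitute the real entries
# from requirements/internvl_chat.txt in the InternVL repository.
pins = {"transformers": "x.y.z", "tokenizers": "a.b.c"}

def mismatches(pins):
    """Return {package: (pinned, installed)} for every pin that disagrees
    with the installed version (installed is None if the package is missing)."""
    bad = {}
    for pkg, want in pins.items():
        try:
            have = version(pkg)
        except PackageNotFoundError:
            have = None
        if have != want:
            bad[pkg] = (want, have)
    return bad

print(mismatches(pins))
```

Any package reported here is a candidate for `pip install -r requirements/internvl_chat.txt` to bring the environment back in line.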
@qishisuren123 I reinstalled the environment following the link, and it now runs normally. Thank you!
Checklist
Describe the bug
I followed the tutorial at https://internvl.readthedocs.io/en/latest/internvl1.5/finetune.html to reproduce fine-tuning of the V1-5 model and encountered the error below. How can I solve this? Which model should the tokenizer (vocabulary) be loaded from?
```
Traceback (most recent call last):
rank1:   File "/data/ll/InternVL/internvl_chat/internvl/train/internvl_chat_finetune.py", line 848, in <module>
rank1:   File "/data/ll/InternVL/internvl_chat/internvl/train/internvl_chat_finetune.py", line 666, in main
rank1:     tokenizer = AutoTokenizer.from_pretrained(
rank1:   File "/root/anaconda3/envs/internvl/lib/python3.9/site-packages/transformers/models/auto/tokenization_auto.py", line 926, in from_pretrained
rank1:     raise ValueError(
rank1: ValueError: Unrecognized configuration class <class 'transformers_modules.InternVL-Chat-V1-5.configuration_internvl_chat.InternVLChatConfig'> to build an AutoTokenizer.
```
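For context on what this error means: `AutoTokenizer.from_pretrained` picks a tokenizer class by looking up the checkpoint's config class in an internal registry, and raises this `ValueError` when the config class has no entry. A custom remote-code config such as `InternVLChatConfig` is exactly that case. The sketch below mimics the dispatch logic with made-up names (it is not the real `transformers` internals), just to show where the error comes from:

```python
# Toy model of AutoTokenizer-style dispatch -- names are illustrative,
# not the actual transformers implementation.
class LlamaConfig: ...          # stands in for a config transformers knows
class InternVLChatConfig: ...   # stands in for a custom remote-code config

# Registry mapping known config classes to tokenizer class names.
TOKENIZER_MAPPING = {LlamaConfig: "LlamaTokenizer"}

def auto_tokenizer_for(config):
    """Return the tokenizer class name registered for this config,
    or raise the same kind of ValueError seen in the traceback."""
    cls = type(config)
    if cls in TOKENIZER_MAPPING:
        return TOKENIZER_MAPPING[cls]
    raise ValueError(
        f"Unrecognized configuration class {cls} to build an AutoTokenizer."
    )
```

This is why a mismatched `transformers` version can trigger the error: if the installed version does not resolve the checkpoint's custom config to a tokenizer the way the pinned version does, the lookup falls through to the `ValueError` branch.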
Reproduction
```
GPUS=4 PER_DEVICE_BATCH_SIZE=2 sh shell/internvl1.5/2nd_finetune/internvl_chat_v1_5_internlm2_20b_dynamic_res_2nd_finetune_lora.sh
```
Environment
Error traceback
No response