I ran the quick inference demo from the GitHub repo and followed the install pipeline without any other changes, but I got the following error:
Traceback (most recent call last):
  File "/home/ssw/DemoFusion/base_models/InternLM-XComposer/projects/ShareGPT4V/demo_share4v.py", line 24, in <module>
    eval_model(args)
  File "/home/ssw/DemoFusion/base_models/InternLM-XComposer/projects/ShareGPT4V/share4v/eval/run_share4v.py", line 31, in eval_model
    tokenizer, model, image_processor, context_len = load_pretrained_model(
  File "/home/ssw/DemoFusion/base_models/InternLM-XComposer/projects/ShareGPT4V/share4v/model/builder.py", line 114, in load_pretrained_model
    tokenizer = AutoTokenizer.from_pretrained(
  File "/home/ssw/anaconda3/envs/share4v/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 699, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class InternLMXComposerTokenizer does not exist or is not currently imported.
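In case it helps with diagnosis, here is a minimal sketch of the call that seems to fail and the workaround I would try next. The model path below is only a placeholder for whatever checkpoint the demo loads, the use_fast=False argument is my reconstruction of the call in builder.py, and passing trust_remote_code=True is just my guess at a fix, not something the repo documents:

```python
# Minimal sketch of the failing tokenizer load, reduced from
# share4v/model/builder.py (line 114 in the traceback above).
from transformers import AutoTokenizer

# Placeholder; substitute the actual checkpoint path used by the demo.
model_path = "Lin-Chen/ShareGPT4V-7B"

# The plain call (as I understand builder.py) raises:
#   ValueError: Tokenizer class InternLMXComposerTokenizer does not exist
#   or is not currently imported.
# tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)

# Allowing the custom tokenizer code shipped with the checkpoint is the
# usual workaround for this error; I am not sure it is the intended fix here.
tokenizer = AutoTokenizer.from_pretrained(
    model_path,
    use_fast=False,
    trust_remote_code=True,
)
```

Is this the right way to run the demo, or is an extra install/import step missing from the pipeline?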
Looking forward to your reply!