I ran into the same problem; it is solved now and the demo runs successfully. You should check the size of the "tokenizer.model" file to make sure it was actually downloaded in full from Hugging Face and is not just a Git LFS pointer stub.
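A quick way to check (a minimal sketch; the path is an example, point it at whatever directory your llama_model config entry uses):

```python
# Sketch: verify that tokenizer.model is the real SentencePiece model,
# not a Git LFS pointer stub. The path below is a placeholder.
from pathlib import Path

tokenizer_path = Path("path/to/llama-weights/tokenizer.model")  # adjust to your setup

size = tokenizer_path.stat().st_size
head = tokenizer_path.read_bytes()[:64]

if head.startswith(b"version https://git-lfs"):
    print("This is a Git LFS pointer file, not the tokenizer itself.")
    print("Re-download the real file (e.g. `git lfs install && git lfs pull` in the repo).")
elif size < 100_000:
    print(f"File is only {size} bytes; the LLaMA tokenizer.model should be ~500 KB.")
else:
    print(f"Looks OK: {size} bytes of binary SentencePiece data.")
```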
OK, thank you
Hello, I hit the same bug, and the content of my "tokenizer.model" is as follows:
'''
version https://git-lfs.github.com/spec/v1
oid sha256:9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347
size 499723
'''
Should I install Git LFS first?
Same issue. Have you resolved it?
Traceback (most recent call last):
  File "/data/zhangjie_data_n/zhangjie_data/MiniGPT-4/demo.py", line 60, in <module>
    model = model_cls.from_config(model_config).to('cuda:{}'.format(args.gpu_id))
  File "/data/zhangjie_data_n/zhangjie_data/MiniGPT-4/minigpt4/models/mini_gpt4.py", line 243, in from_config
    model = cls(
  File "/data/zhangjie_data_n/zhangjie_data/MiniGPT-4/minigpt4/models/mini_gpt4.py", line 86, in __init__
    self.llama_tokenizer = LlamaTokenizer.from_pretrained(llama_model, use_fast=False)
  File "/data/miniconda3/envs/minigpt4_1/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 1811, in from_pretrained
    return cls._from_pretrained(
  File "/data/miniconda3/envs/minigpt4_1/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 1965, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/data/miniconda3/envs/minigpt4_1/lib/python3.9/site-packages/transformers/models/llama/tokenization_llama.py", line 96, in __init__
    self.sp_model.Load(vocab_file)
  File "/data/miniconda3/envs/minigpt4_1/lib/python3.9/site-packages/sentencepiece/__init__.py", line 905, in Load
    return self.LoadFromFile(model_file)
  File "/data/miniconda3/envs/minigpt4_1/lib/python3.9/site-packages/sentencepiece/__init__.py", line 310, in LoadFromFile
    return _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg)
RuntimeError: Internal: src/sentencepiece_processor.cc(1101) [model_proto->ParseFromArray(serialized.data(), serialized.size())]
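For what it's worth, this RuntimeError is sentencepiece failing to parse tokenizer.model as a protobuf, which is exactly what happens when the file on disk is a Git LFS pointer stub like the one quoted above. A minimal sketch of fetching just the real tokenizer with huggingface_hub; the repo id here is a placeholder, use whichever repository your LLaMA weights came from:

```python
# Sketch: download the real tokenizer.model via huggingface_hub instead of a plain
# `git clone` (which only leaves LFS pointer files unless git-lfs is configured).
# "your-org/your-llama-repo" is a placeholder repo id, not the actual source.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="your-org/your-llama-repo",  # placeholder: the repo holding your LLaMA weights
    filename="tokenizer.model",
)
print(local_path)  # point the llama_model path in your config at this file's directory
```

Alternatively, running `git lfs install` and then `git lfs pull` inside the already-cloned weights repo should replace the pointer files with the real blobs.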