tingxueronghua / ChartLlama-code


ERROR: RuntimeError: Internal: src/sentencepiece_processor.cc(1101) [model_proto->ParseFromArray(serialized.data(), serialized.size())] #12

Closed · shllgtca closed this 7 months ago

shllgtca commented 7 months ago

Hi!

I've been trying to run the inference example on the main page: https://github.com/tingxueronghua/ChartLlama-code?tab=readme-ov-file#-inference

But I've been getting the error: RuntimeError: Internal: src/sentencepiece_processor.cc(1101) [model_proto->ParseFromArray(serialized.data(), serialized.size())]

As model_path, I'm using the path to my downloaded copy of this repo: https://huggingface.co/listen2you002/ChartLlama-13b/tree/main

I added the model_base arg because its default was "/mnt/private_yucheng/huggingface_hub/llava-v1.5-13b", so I'm pointing it at my downloaded copy of this repo instead: https://huggingface.co/liuhaotian/llava-v1.5-13b/tree/main
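If it helps narrow things down, a sanity check like the sketch below should show whether the tokenizer.model files themselves load with sentencepiece, independent of the ChartLlama inference code (the paths are placeholders for my local downloads, not anything from the repo):

```python
# Minimal sanity check (not from the repo): try loading tokenizer.model directly.
# If this raises the same "ParseFromArray" RuntimeError, the problem is the file
# itself rather than the inference script.
import sentencepiece as spm

for path in [
    "/path/to/ChartLlama-13b/tokenizer.model",   # placeholder path
    "/path/to/llava-v1.5-13b/tokenizer.model",   # placeholder path
]:
    sp = spm.SentencePieceProcessor()
    try:
        sp.Load(path)
        print(path, "loads fine,", sp.GetPieceSize(), "pieces")
    except (RuntimeError, OSError) as e:
        print(path, "failed:", e)
```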

From some research, it looks like the error comes from a mismatch in the weights source: https://github.com/facebookresearch/llama/issues/615

However, I downloaded both from Hugging Face. Could you help me out with this one? Thanks in advance.
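For completeness, one known cause of this exact ParseFromArray error is an incomplete git-lfs download: if a repo is cloned without git-lfs, tokenizer.model ends up as a tiny text pointer file instead of the real binary. A quick check (again with placeholder paths):

```python
# Check whether a file is a git-lfs pointer rather than the real binary.
# Real tokenizer.model files are a few hundred KB; LFS pointers are ~130 bytes
# of text starting with "version https://git-lfs.github.com/spec/v1".
import os

def looks_like_lfs_pointer(path):
    if os.path.getsize(path) > 1024:
        return False
    with open(path, "rb") as f:
        return f.read(7) == b"version"

for path in [
    "/path/to/ChartLlama-13b/tokenizer.model",   # placeholder
    "/path/to/llava-v1.5-13b/tokenizer.model",   # placeholder
]:
    print(path, "LFS pointer?", looks_like_lfs_pointer(path))
```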

tingxueronghua commented 7 months ago

I am not sure about the cause. Just to confirm, did you install the requirements with "pip install -e .[train]"? Some of the dependencies are listed under [train].
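Also, to rule out a dependency/version mismatch, something like the sketch below will print which versions are actually being imported in your environment (the package names here are just the usual suspects for this error, not the exact pins from [train]):

```python
# Print installed versions of packages commonly involved in tokenizer errors.
import importlib.metadata as md

for pkg in ["sentencepiece", "transformers", "tokenizers", "torch"]:
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")
```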