mlpc-ucsd / BLIVA

(AAAI 2024) BLIVA: A Simple Multimodal LLM for Better Handling of Text-rich Visual Questions
https://arxiv.org/abs/2308.09936
BSD 3-Clause "New" or "Revised" License

Error "input lengths of input ids is 0" #21

Open NuiMrme opened 6 months ago

NuiMrme commented 6 months ago

Hello. The only thing that differs from the installation instructions is that I used Vicuna v1.5, which downloads the weights locally from HF. In `bliva_vicuna7b.yaml` I set `llm_model:` to the locally downloaded weights folder; I'm not sure if this is the source of the problem. Otherwise, I used an image of the same size (224). Here is my traceback:
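For reference, the config change described above would look roughly like this (the local path is a placeholder, and the exact nesting of the key follows the LAVIS-style layout the repo uses, which is an assumption here):

```yaml
# bliva_vicuna7b.yaml — pointing llm_model at a locally downloaded checkpoint
# (the path below is a hypothetical placeholder, not from the original report)
model:
  llm_model: "/home/user/weights/vicuna-7b-v1.5"
```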

```
Traceback (most recent call last):
  File "/home/user/BLIVA/evaluate.py", line 93, in <module>
    main(args)
  File "/home/user/BLIVA/evaluate.py", line 85, in main
    eval_one(image, question, model)
  File "/home/user/BLIVA/evaluate.py", line 46, in eval_one
    outputs = model.generate({"image": image, "prompt": question})
  File "/home/user/conda_env/bliva/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/home/user/BLIVA/bliva/models/bliva_vicuna7b.py", line 382, in generate
    outputs = self.llm_model.generate(
  File "/home/user/conda_env/bliva/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/home/user/conda_env/bliva/lib/python3.9/site-packages/transformers/generation/utils.py", line 1447, in generate
    self._validate_generated_length(generation_config, input_ids_length, has_default_max_length)
  File "/home/user/conda_env/bliva/lib/python3.9/site-packages/transformers/generation/utils.py", line 1166, in _validate_generated_length
    raise ValueError(
ValueError: Input length of input_ids is 0, but `max_length` is set to -39. This can lead to unexpected behavior. You should consider increasing `max_length` or, better yet, setting `max_new_tokens`.
```

Thanks in advance

lendrick commented 5 months ago

The latest transformers seems to be the cause of this.

pip install transformers==4.28.0

Note: Using Vicuna 1.5 doesn't work, but isn't the cause of this error. You'll have to follow their instructions for applying the delta.
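For context, the error comes from a stricter generation-length validation added in newer transformers releases: with an empty `input_ids` the computed length budget ends up negative, and the check raises. A minimal sketch of that check (names and signature simplified; this mimics only the branch that fires in the traceback, not the exact library internals):

```python
# Simplified sketch of _validate_generated_length from newer transformers
# (transformers/generation/utils.py). The real method takes a
# GenerationConfig; here the relevant values are passed directly.

def validate_generated_length(input_ids_length: int, max_length: int) -> None:
    # If the prompt is already as long as (or longer than) the total
    # budget, there is no room left to generate anything.
    if input_ids_length >= max_length:
        raise ValueError(
            f"Input length of input_ids is {input_ids_length}, but "
            f"`max_length` is set to {max_length}. This can lead to "
            "unexpected behavior. You should consider increasing "
            "`max_length` or, better yet, setting `max_new_tokens`."
        )

# The values from the traceback above (empty input_ids, negative budget)
# trip the check, reproducing the reported error:
try:
    validate_generated_length(0, -39)
except ValueError as e:
    print(type(e).__name__)  # ValueError
```

Pinning transformers to 4.28.0 as suggested above predates this validation, which is why the downgrade makes the error disappear.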

chuanshen-chen commented 2 months ago

> The latest transformers seems to be the cause of this.
>
> pip install transformers==4.28.0
>
> Note: Using Vicuna 1.5 doesn't work, but isn't the cause of this error. You'll have to follow their instructions for applying the delta.

thanks!!!!