shikiw / OPERA

[CVPR 2024 Highlight] OPERA: Alleviating Hallucination in Multi-Modal Large Language Models via Over-Trust Penalty and Retrospection-Allocation
MIT License

Questions about function prepare_inputs_labels_for_multimodal #24

Closed KlaineWei closed 1 month ago

KlaineWei commented 4 months ago

Hi, when I was running the code, I found that most of the time it hit the case at line 248 of the function prepare_inputs_labels_for_multimodal (OPERA/minigpt4/models/llava_arch.py). Does that mean there is only one token in the prompt? Why does this happen?
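For context, that part of LLaVA's prepare_inputs_labels_for_multimodal splits the prompt's token ids around image-token placeholders so that each text segment can be embedded separately and the image features spliced in between. The sketch below is a simplified, hypothetical illustration of that splitting step, not the actual repository code; the name `IMAGE_TOKEN_INDEX` and its value follow LLaVA's convention, while `split_around_image_tokens` is a made-up helper for this example.

```python
import torch

# LLaVA uses a sentinel id to mark where <image> features get inserted.
IMAGE_TOKEN_INDEX = -200

def split_around_image_tokens(input_ids):
    """Split a 1-D tensor of token ids into the text segments that lie
    between IMAGE_TOKEN_INDEX placeholders (simplified illustration)."""
    positions = torch.where(input_ids == IMAGE_TOKEN_INDEX)[0].tolist()
    segments, start = [], 0
    for pos in positions:
        segments.append(input_ids[start:pos])  # text before this image token
        start = pos + 1                        # skip the placeholder itself
    segments.append(input_ids[start:])         # trailing text segment
    return positions, segments

# A prompt with one image placeholder surrounded by text tokens:
ids = torch.tensor([1, 5, IMAGE_TOKEN_INDEX, 9, 2])
positions, segments = split_around_image_tokens(ids)
```

If a prompt contains no placeholder at all, `positions` is empty and the whole sequence comes back as a single text segment, which is the kind of degenerate case the branch in question has to handle.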

shikiw commented 4 months ago

Hi,

This part of the code comes from LLaVA's official implementation. Do you have any specific error logs? I'd like to know how you are running the code so I can help based on those details :)