shikiw / OPERA

[CVPR 2024 Highlight] OPERA: Alleviating Hallucination in Multi-Modal Large Language Models via Over-Trust Penalty and Retrospection-Allocation
MIT License

AttributeError: 'MiniGPT4' object has no attribute 'embed_tokens' #5

Closed BillChan226 closed 8 months ago

BillChan226 commented 8 months ago

I am trying to reproduce your results using the mini_gpt4 backbone. It seems there is a bug in the /home/czr/contrast_decoding_LVLMs/sota_to_compare/OPERA/minigpt4/models/mini_gpt4.py file.

Here is the complete output:

Initializing Model
Loading VIT
Loading VIT Done
Do not use Q-Former here.
Loading LLAMA
Loading checkpoint shards: 100%|████████████████████████████████████████████████████████████████████████████| 2/2 [00:17<00:00, 8.52s/it]
Loading LLAMA Done
Load BLIP2-LLM Checkpoint: /home/czr/contrast_decoding_LVLMs/model_checkpoints/pretrained_minigpt4_llama2_7b.pth
Compose(
    Resize(size=(224, 224), interpolation=bicubic, max_size=None, antialias=warn)
    ToTensor()
    Lambda()
)
Done!
  0%|          | 0/50 [00:02<?, ?it/s]
Traceback (most recent call last):
  File "/home/czr/contrast_decoding_LVLMs/sota_to_compare/OPERA/chair_eval.py", line 174, in <module>
    out = model.generate(
  File "/home/czr/anaconda3/envs/minigptv/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/czr/contrast_decoding_LVLMs/sota_to_compare/OPERA/minigpt4/models/mini_gpt4.py", line 362, in generate
    inputs_embeds, attention_mask, img_start_pos = self.prompt_wrap(img_embeds, atts_img, instruction)
  File "/home/czr/contrast_decoding_LVLMs/sota_to_compare/OPERA/minigpt4/models/mini_gpt4.py", line 226, in prompt_wrap
    p_before_embed = self.embed_tokens(p_before_tokens.input_ids)
  File "/home/czr/anaconda3/envs/minigptv/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1614, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'MiniGPT4' object has no attribute 'embed_tokens'

Could you please tell me what went wrong, so that I can reproduce your reported results and cite your paper? Appreciate it :)

shikiw commented 8 months ago

Hi,

Thanks for your appreciation! I'm sorry for the bug; I have fixed it by adding the following function: https://github.com/shikiw/OPERA/blob/dba0dda9457a3234d22ef4b60ea38b74a02d3905/minigpt4/models/mini_gpt4.py#L412-L417 It was my mistake: I accidentally deleted it while organizing the code. Please feel free to contact me if you run into any problems while reproducing our results :)

BillChan226 commented 8 months ago

Thank you! It's running now.