shikiw / OPERA

[CVPR 2024 Highlight] OPERA: Alleviating Hallucination in Multi-Modal Large Language Models via Over-Trust Penalty and Retrospection-Allocation
MIT License

reproducing shikra problem #22

Closed KlaineWei closed 4 months ago

KlaineWei commented 4 months ago

When reproducing the results of Shikra, the evaluation outputs are wrong, as follows:

```
Done! load data finished
Start eval...
  0%|          | 0/2910 [00:00<?, ?it/s]
/root/autodl-tmp/OPERA/transformers-4.29.2/src/transformers/generation/utils.py:1262: UserWarning: You have modified the pretrained model configuration to control generation. This is a deprecated strategy to control generation and will be removed soon, in a future version. Please use a generation configuration file (see https://huggingface.co/docs/transformers/main_classes/text_generation)
  warnings.warn(
noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer
  0%|          | 1/2910 [00:06<5:10:01, 6.39s/it]
noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer
  0%|          | 2/2910 [00:10<4:06:04, 5.08s/it]
noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer
  0%|          | 3/2910 [00:14<3:45:29, 4.65s/it]
noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer
  0%|▏         | 4/2910 [00:18<3:35:52, 4.46s/it]
noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer noreferrer
  0%|▏         | 5/2910 [00:23<3:30:31, 4.35s/it]
```
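Separately from the wrong `noreferrer` outputs, the `UserWarning` in the log comes from setting generation options by mutating the model config, which transformers 4.29 deprecates. A minimal sketch of the recommended alternative — passing a `GenerationConfig` object explicitly — is below; the parameter values are illustrative assumptions, not the settings OPERA's Shikra eval script actually uses:

```python
# Hedged sketch: build a GenerationConfig instead of editing model.config,
# which is what the deprecation warning in the pasted log asks for.
from transformers import GenerationConfig

gen_config = GenerationConfig(
    max_new_tokens=64,  # illustrative value, not OPERA's actual setting
    num_beams=5,        # illustrative beam width, an assumption
    do_sample=False,    # greedy/beam decoding, an assumption
)

# At generation time, pass it explicitly (model/inputs come from your
# own eval script, e.g. something like):
#   outputs = model.generate(**inputs, generation_config=gen_config)
print(gen_config.num_beams)
```

This only silences the deprecation path; it does not by itself explain the repeated `noreferrer` strings in the decoded outputs.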