shikiw / OPERA

[CVPR 2024 Highlight] OPERA: Alleviating Hallucination in Multi-Modal Large Language Models via Over-Trust Penalty and Retrospection-Allocation

stopping criteria #25

Closed · yeonju7kim closed this issue 3 months ago

yeonju7kim commented 3 months ago

Thank you for your great work.

https://github.com/shikiw/OPERA/blob/aa968c7501f4d3d8362f4b3bcab855024f4da5f6/transformers-4.29.2/src/transformers/generation/utils.py#L1628-L1629

I want to implement OPERA decoding, but I don't understand this part. Why is it needed? This check is stopping me from getting to the next step.
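
For context: in stock transformers, `StoppingCriteriaList.max_length` is a property that returns `None` unless the list contains a `MaxLengthCriteria`, so any code that compares against it or does arithmetic with it breaks. A quick, runnable illustration using only the public transformers API:

```python
from transformers import MaxLengthCriteria, StoppingCriteriaList

criteria = StoppingCriteriaList()  # empty, as on the failing code path
print(criteria.max_length)         # None -> breaks checks like the one linked above

criteria.append(MaxLengthCriteria(max_length=128))
print(criteria.max_length)         # 128 -> the check can proceed
```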

yeonju7kim commented 3 months ago

I solved it. I found that if you use a recent version of transformers, you have to pass the following argument; otherwise, stopping_criteria.max_length is None.

```python
stopping_criteria=prepared_stopping_criteria
```
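
In other words, when porting the OPERA generation code to a recent transformers version, make sure the criteria prepared inside generate() are actually forwarded to the decoding loop. A minimal sketch of what that looks like; everything besides the stock transformers helpers is an illustrative stand-in (in particular, `opera_sample` is a hypothetical name for wherever your port invokes OPERA's decoding):

```python
from transformers import MaxLengthCriteria, StoppingCriteriaList

# Inside a generate() ported to recent transformers, after generation_config
# has been resolved:
prepared_stopping_criteria = self._get_stopping_criteria(
    generation_config=generation_config,
    stopping_criteria=stopping_criteria if stopping_criteria is not None else StoppingCriteriaList(),
)

# Safety net: if nothing contributed a MaxLengthCriteria, add one so that
# stopping_criteria.max_length is not None at the linked check.
if prepared_stopping_criteria.max_length is None:
    prepared_stopping_criteria.append(
        MaxLengthCriteria(max_length=generation_config.max_length)
    )

# The fix from this thread: pass the prepared list down explicitly instead of
# letting the decoding method fall back to an empty StoppingCriteriaList.
outputs = self.opera_sample(  # hypothetical name for the OPERA decoding method
    input_ids,
    stopping_criteria=prepared_stopping_criteria,
    # ...remaining kwargs unchanged
)
```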