salesforce / BLIP

PyTorch code for BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation
BSD 3-Clause "New" or "Revised" License

Update med.py: Fixed the issue of BertEncoder.forward() not returning cross-attentions when requested #171

Open programmingLearner opened 1 year ago

programmingLearner commented 1 year ago


In the BertEncoder.forward() method, all_cross_attentions is defined on line 409 but never updated, which causes a None object to be returned when cross-attentions are requested. In this revision, all_cross_attentions is properly updated and maintained on line 461. The maintenance code follows the original Hugging Face Transformers library (https://github.com/huggingface/transformers/blob/v4.15.0/src/transformers/models/bert/modeling_bert.py, line 600) and has been tested to work.
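A minimal sketch of the pattern the fix restores (not the actual med.py code; the function and dummy layer outputs here are simplified stand-ins): inside the per-layer loop of a BertEncoder-style forward(), the all_cross_attentions tuple must be extended alongside all_self_attentions, otherwise it is still empty when the caller asks for cross-attentions.

```python
def encoder_forward(layer_outputs_per_layer,
                    output_attentions=True,
                    add_cross_attention=True):
    """Simplified stand-in for BertEncoder.forward().

    Each element of layer_outputs_per_layer mimics one BertLayer's
    outputs: (hidden_states, self_attention_probs, cross_attention_probs).
    """
    all_self_attentions = () if output_attentions else None
    # Defined up front -- in the buggy version this tuple was never
    # updated inside the loop, so callers got an empty result back.
    all_cross_attentions = () if (output_attentions and add_cross_attention) else None

    hidden_states = None
    for layer_outputs in layer_outputs_per_layer:
        hidden_states = layer_outputs[0]
        if output_attentions:
            all_self_attentions = all_self_attentions + (layer_outputs[1],)
            # The fix: accumulate the cross-attention maps too,
            # mirroring the upstream Transformers implementation.
            if add_cross_attention:
                all_cross_attentions = all_cross_attentions + (layer_outputs[2],)

    return hidden_states, all_self_attentions, all_cross_attentions
```

With the accumulation line in place, the returned cross-attention tuple has one entry per layer instead of coming back empty.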

salesforce-cla[bot] commented 1 year ago

Thanks for the contribution! Before we can merge this, we need @programmingLearner to sign the Salesforce Inc. Contributor License Agreement.