LifeIsStrange opened this issue 5 years ago
This is not an issue; close it.
@huseinzol05 this is a feature request: a potential improvement to XLNet results, which would be major. Issues are not necessarily bugs.
If the admins disagree, they can close it; I will not, as Attention on Attention makes sense and is probably the future of language models.
In this paper, we propose the Attention on Attention (AoA) module, an extension to conventional attention mechanisms, to address the irrelevant attention issue. Furthermore, we propose AoANet for image captioning by applying AoA to both the encoder and decoder. Extensive experiments conducted on the MS COCO dataset demonstrate the superiority and general applicability of our proposed AoA module and AoANet. More remarkably, we achieve a new state-of-the-art performance for image captioning.
From https://paperswithcode.com/paper/attention-on-attention-for-image-captioning — this seems like a general-purpose innovation worth trying! A rough sketch of the mechanism is below.
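For reference, the AoA module in the paper takes the query and the conventional attention result, computes an "information vector" and a sigmoid "attention gate" from their concatenation, and returns their elementwise product. Here is a minimal PyTorch sketch of that idea (class and variable names are my own, not from any existing repo, and the wrapping around `nn.MultiheadAttention` is just one possible way to use it):

```python
import torch
import torch.nn as nn

class AttentionOnAttention(nn.Module):
    """Attention on Attention (AoA): gate the attention result with the query.

    Given a query q and a conventional attention result v_hat, AoA computes
    an information vector i and a sigmoid gate g from [q; v_hat], and
    returns g * i (elementwise).
    """
    def __init__(self, dim):
        super().__init__()
        # Information vector: i = W_i [q; v_hat] + b_i
        self.info = nn.Linear(2 * dim, dim)
        # Attention gate: g = sigmoid(W_g [q; v_hat] + b_g)
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, query, att_result):
        x = torch.cat([query, att_result], dim=-1)
        i = self.info(x)
        g = torch.sigmoid(self.gate(x))
        return g * i

# Example usage: gate the output of any attention layer with its query.
dim = 512
aoa = AttentionOnAttention(dim)
attn = nn.MultiheadAttention(dim, num_heads=8, batch_first=True)

q = torch.randn(2, 10, dim)    # (batch, query_len, dim)
kv = torch.randn(2, 20, dim)   # (batch, key_len, dim)
att_out, _ = attn(q, kv, kv)
out = aoa(q, att_out)          # gated attention result, same shape as q
```

The appeal for a Transformer-style model like XLNet is that the gate can suppress attention results that are irrelevant to the query, instead of always passing the weighted sum through.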