liuwei1206 / LEBERT

Code for the ACL2021 paper "Lexicon Enhanced Chinese Sequence Labelling Using BERT Adapter"

关于bilinear attention的疑问 #12

Closed yikedouer closed 3 years ago

yikedouer commented 3 years ago

Why is bilinear attention used? The word embeddings are already dimension-aligned with the character embeddings through a non-linear transformation, so what is the motivation for adding bilinear attention on top of that?
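For context, the general idea behind bilinear attention: a plain dot product h·v only scores dimension-wise matches between a character vector h and a word vector v, while the bilinear form h&#x1D40;Wv lets every character dimension interact with every word dimension through a learnable matrix W. A minimal NumPy sketch (the shapes, names, and random weights here are illustrative, not the paper's actual implementation):

```python
import numpy as np

def bilinear_attention(char_hidden, word_embs, W):
    """Score each matched word against a character via a bilinear form.

    char_hidden: (d,)   character representation
    word_embs:   (n, d) matched-word embeddings, already dimension-aligned
    W:           (d, d) learnable bilinear weight (random here, for illustration)
    Returns the attention-weighted sum of the word embeddings, shape (d,).
    """
    scores = word_embs @ W @ char_hidden              # (n,) bilinear scores v_j^T W h
    scores = scores - scores.max()                    # subtract max for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()   # softmax over the matched words
    return weights @ word_embs                        # (d,) weighted word vector

# Toy example: one character with 3 matched words in an 8-dim space.
rng = np.random.default_rng(0)
d, n = 8, 3
h = rng.standard_normal(d)
V = rng.standard_normal((n, d))
W = rng.standard_normal((d, d))
z = bilinear_attention(h, V, W)
print(z.shape)  # (8,)
```

The non-linear transformation only makes the two spaces the same size; the bilinear weight W additionally learns which character-word feature interactions should drive the attention weights.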

liuwei1206 commented 3 years ago

Hi,

I would recommend reading the original paper, where I explain the motivation for using bilinear attention.

Wei