cooelf / SemBERT

Semantics-aware BERT for Language Understanding (AAAI 2020)
https://arxiv.org/abs/1909.02209
MIT License

Why is the SRL information not used as an additional embedding layer? #17

Closed LinMu7177 closed 4 years ago

LinMu7177 commented 4 years ago

First of all, thank you for open-sourcing this! It isn't clear to me why the SRL information is not used as an additional embedding layer but is instead aggregated separately. Have you tried that before? I hope you can share your opinion, thank you very much!

cooelf commented 4 years ago

Hi, I have not tried it. My thinking is that modifying the internal embedding layer might disturb the parameter structure of the pre-trained BERT. Anyway, it would be a good thing to try when time allows.
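
For concreteness, here is a simplified PyTorch sketch of the two options (hypothetical class and variable names, not the repository's actual code; SemBERT's real fusion is more involved, e.g. it runs the tag sequences through a GRU). Late fusion combines SRL tag embeddings with BERT's output states, leaving the pre-trained embedding layer untouched; the alternative raised in this issue would inject them into BERT's input sum instead.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class SrlLateFusion(nn.Module):
    """Sketch of the aggregate-separately approach: SRL tag embeddings
    are fused with BERT's *output* states, so the pre-trained embedding
    layer and its parameter structure stay intact."""

    def __init__(self, num_srl_tags: int, srl_dim: int = 10, hidden: int = 768):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.srl_emb = nn.Embedding(num_srl_tags, srl_dim)
        # Project the concatenated [contextual; SRL] features back to hidden size.
        self.fuse = nn.Linear(hidden + srl_dim, hidden)

    def forward(self, input_ids, attention_mask, srl_tag_ids):
        ctx = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        srl = self.srl_emb(srl_tag_ids)                # (batch, seq, srl_dim)
        return self.fuse(torch.cat([ctx, srl], dim=-1))

# The alternative discussed above would add the SRL embedding inside BERT's
# input sum (token + position + segment), e.g. via inputs_embeds (this
# requires srl_dim == hidden):
#   emb = bert.embeddings.word_embeddings(input_ids) + srl_emb(srl_tag_ids)
#   ctx = bert(inputs_embeds=emb, attention_mask=attention_mask).last_hidden_state
# which changes the input distribution the pre-trained encoder was trained on.
```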

LinMu7177 commented 4 years ago

Okay, thank you for your reply!!!