cooelf / SemBERT

Semantics-aware BERT for Language Understanding (AAAI 2020)
https://arxiv.org/abs/1909.02209
MIT License
285 stars · 55 forks

The forward pass only uses [CLS], so is the CNN alignment after BERT still necessary? Also, have you tried incorporating part-of-speech features? #21

Closed jkkl closed 3 years ago

jkkl commented 3 years ago

`first_token_tensor = sequence_output[:, 0]`
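For context, this is the standard BERT pooling step: indexing position 0 along the sequence axis selects the [CLS] token's hidden vector for each example. A minimal sketch with assumed shapes (NumPy here for illustration; the repo uses PyTorch tensors, where the indexing behaves the same way):

```python
import numpy as np

# Assumed shapes for illustration: (batch, seq_len, hidden)
batch, seq_len, hidden = 2, 8, 4
sequence_output = np.random.randn(batch, seq_len, hidden)

# Index 0 on the sequence axis picks out the [CLS] position,
# dropping that axis: result is (batch, hidden)
first_token_tensor = sequence_output[:, 0]
```

So downstream classification sees only the [CLS] vector, which is what prompts the question about whether the token-level CNN alignment still matters.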

cooelf commented 3 years ago

The alignment would be useful for improving the local interactions between the adjacent tokens. I simply concatenated them with the token embeddings, but did not see obvious improvements.
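The concatenation described above can be sketched as follows, with hypothetical shapes (`tag_dim` for the aligned semantic features is an assumption, not the repo's actual configuration):

```python
import numpy as np

# Hypothetical shapes: token embeddings plus CNN-aligned semantic features
batch, seq_len, hidden, tag_dim = 2, 8, 4, 3
token_emb = np.random.randn(batch, seq_len, hidden)       # BERT token embeddings
aligned_feat = np.random.randn(batch, seq_len, tag_dim)   # aligned label features

# Concatenate along the last (feature) axis; each token's vector grows
# from `hidden` to `hidden + tag_dim` dimensions
fused = np.concatenate([token_emb, aligned_feat], axis=-1)
```

This fuses per-token features before any pooling, which is why the alignment can still influence the [CLS]-based prediction through self-attention, even though the concatenation itself reportedly brought no obvious gains.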

jkkl commented 3 years ago

@cooelf Thanks for your reply. In my scenario, compared with bert-base-chinese, it improved by about 5 points.