ICLR-2020-StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding #360
BrambleXu opened 1 year ago
Summary:
Adds two extra objectives to BERT's pre-training: an intra-sentence task for learning word order, and an inter-sentence task for learning the ordering between sentences.
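The two objectives can be sketched as data-construction steps. This is a minimal illustrative sketch, not the paper's implementation: the function names, the trigram span size, and the seeded sampling are my assumptions for demonstration.

```python
import random

def shuffle_trigram(tokens, seed=0):
    # Word structural objective (sketch): shuffle one random trigram;
    # the model is trained to reconstruct the original token order.
    rng = random.Random(seed)
    i = rng.randrange(len(tokens) - 2)      # start index of the trigram
    corrupted = list(tokens)
    span = corrupted[i:i + 3]
    rng.shuffle(span)
    corrupted[i:i + 3] = span
    return corrupted, tokens[i:i + 3], i    # corrupted input, target order, position

def sentence_pair(sentences, idx, seed=0):
    # Sentence structural objective (sketch): pair sentence idx with its next
    # sentence (label 0), its previous sentence (label 1), or a random sentence
    # from elsewhere in the document (label 2); the model predicts the label.
    rng = random.Random(seed)
    label = rng.randrange(3)
    if label == 0:
        other = sentences[idx + 1]
    elif label == 1:
        other = sentences[idx - 1]
    else:
        other = rng.choice(sentences[:idx - 1] + sentences[idx + 2:])
    return (sentences[idx], other), label
```

The shuffled-trigram task forces the encoder to model local word order explicitly, while the three-way sentence task is a harder variant of BERT's binary next-sentence prediction.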
Resource:
Paper information:
Notes:
Model Graph:
Result:
Thoughts:
Judging from its design, this model's contribution to sentence-similarity tasks is probably small.
Next Reading: