BrambleXu opened this issue 5 years ago
One-sentence summary:
Shows that unsupervised pre-training is effective for learning constituency parsing. The paper compares fastText, which is not contextual, against ELMo and BERT, which are contextual; in terms of parsing results, BERT > ELMo > fastText.
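Since the summary hinges on the contextual vs. non-contextual distinction, here is a minimal sketch (not the paper's code; the `bert-base-uncased` checkpoint and the example sentences are assumptions for illustration) showing that BERT assigns different vectors to the same word in different contexts, whereas a static embedding like fastText would return one fixed vector per word:

```python
# Minimal sketch: contextual (BERT) vs. static (fastText-style) embeddings.
# "bert-base-uncased" is a standard Hugging Face checkpoint, assumed here.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

sentences = [
    "The bank approved the loan.",   # "bank" = financial institution
    "They sat on the river bank.",   # "bank" = edge of a river
]

bank_vectors = []
with torch.no_grad():
    for sent in sentences:
        enc = tokenizer(sent, return_tensors="pt")
        hidden = model(**enc).last_hidden_state[0]          # (seq_len, 768)
        tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
        bank_vectors.append(hidden[tokens.index("bank")])   # vector for "bank"

# BERT produces different vectors for "bank" in the two contexts, so the
# cosine similarity is well below 1.0; a static embedding such as fastText
# would return the exact same vector both times (similarity 1.0).
sim = torch.cosine_similarity(bank_vectors[0], bank_vectors[1], dim=0)
print(f"cosine similarity between the two 'bank' vectors: {sim.item():.3f}")
```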
Resources:
Paper info:
Notes:
The main reason for reading this paper is that it also serves as evidence that BERT learns syntactic information.
In this work, we study a broader range of pre-training conditions and experiment over a variety of languages, both jointly and individually.
Model figure:
Results:
Papers to read next: