BrambleXu / knowledge-graph-learning

A curated list of awesome knowledge graph tutorials, projects and communities.

ACL-2019/06-Multilingual Constituency Parsing with Self-Attention and Pre-Training #229

Open BrambleXu opened 5 years ago


One-sentence summary:

The paper shows that unsupervised pre-training is effective for learning constituency parsing. It compares fastText, whose embeddings are not contextual, against the contextual ELMo and BERT; in terms of results, BERT > ELMo > fastText.
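The contrast the paper relies on can be sketched in a toy way: a fastText-style embedding assigns one fixed vector per word type, while an ELMo/BERT-style embedding depends on the surrounding sentence. The snippet below is a minimal illustration only, not the paper's code or any real model: `static_vec` and `contextual_vec` are hypothetical stand-ins (a hash-based vector, and a neighbor-averaged one).

```python
# Toy sketch (not the paper's code): contrast a non-contextual embedding
# (fastText-style: one vector per word type) with a contextual one
# (ELMo/BERT-style: the vector depends on the surrounding sentence).
import hashlib

def static_vec(word, dim=4):
    """Deterministic per-word vector, independent of context (fastText-like)."""
    h = hashlib.md5(word.encode("utf-8")).digest()
    return [b / 255 for b in h[:dim]]

def contextual_vec(sentence, i, dim=4):
    """Toy 'contextual' vector: the word's static vector averaged with its
    neighbors', so the same word gets different vectors in new contexts."""
    words = sentence.split()
    window = words[max(0, i - 1): i + 2]
    vecs = [static_vec(w, dim) for w in window]
    return [sum(v[d] for v in vecs) / len(vecs) for d in range(dim)]

s1 = "the bank approved the loan"
s2 = "we sat on the river bank"
# Non-contextual: identical vector for "bank" in both sentences.
print(static_vec("bank") == static_vec("bank"))        # True
# Contextual: the vectors differ because the neighbors differ.
print(contextual_vec(s1, 1) != contextual_vec(s2, 5))  # True
```

The paper's finding is that feeding the contextual kind of representation into a parser helps substantially more than the static kind.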

Resources:

Paper information:

Notes:

The main reason for reading this paper: it also provides evidence that BERT learns syntactic information.

In this work, we study a broader range of pre-training conditions and experiment over a variety of languages, both jointly and individually.


Model figure:

[figure: model architecture]

Results

[figure: results]

Papers to read next