BrambleXu / knowledge-graph-learning

A curated list of awesome knowledge graph tutorials, projects and communities.
MIT License

EMNLP-2019/11-Improving Relation Extraction with Knowledge-attention #292

Open BrambleXu opened 4 years ago


Summary:

The paper injects external lexical information into the relation extraction (RE) task with a newly proposed knowledge-attention encoder.

Resource:

Paper information:

Notes:

1 Introduction

A recent study (Li and Mao, 2019) shows that incorporating prior knowledge from external lexical resources into deep neural networks can reduce the reliance on training data and improve relation extraction performance.

Inspired by the study above, this paper proposes a model that captures linguistic clues useful for RE from lexical resources.
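The core idea can be sketched as an attention layer whose keys and values come from embeddings of relation-indicator terms drawn from a lexical resource, rather than from the input sentence itself. The snippet below is a minimal NumPy sketch of that idea, not the paper's actual implementation; the function name, shapes, and the use of simple dot-product attention are all assumptions for illustration.

```python
import numpy as np

def knowledge_attention(token_vecs, knowledge_vecs):
    """Attend each input token over a set of knowledge embeddings
    (e.g., relation-indicator terms from a lexical resource) and
    return knowledge-aware token representations.

    token_vecs:     (seq_len, d) input token representations
    knowledge_vecs: (k, d) embeddings of lexical-resource entries

    NOTE: hypothetical sketch -- the paper's encoder is more involved.
    """
    scores = token_vecs @ knowledge_vecs.T            # (seq_len, k)
    scores -= scores.max(axis=1, keepdims=True)       # softmax stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)     # normalize per token
    return weights @ knowledge_vecs                   # (seq_len, d)

# Toy usage with random vectors
rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 8))      # 5 tokens, dim 8
knowledge = rng.normal(size=(3, 8))   # 3 knowledge entries, dim 8
out = knowledge_attention(tokens, knowledge)
print(out.shape)  # (5, 8)
```

In this framing, each token representation becomes a convex combination of knowledge embeddings, so tokens that resemble known relation indicators get pulled toward them; in a full model this output would typically be concatenated with the standard self-attention output.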

5.3.5 Error analysis

This part discusses the false-positive problem and some issues with the TACRED dataset, e.g., multiple entities with different relations co-occurring in one sentence (they all appear in the same sentence, but only one pair is actually annotated), as well as imperfect annotations.

Model Graph:

Result:

Thoughts:

Next Reading: