
arXiv-2019/06-Extracting Multiple-Relations in One-Pass with Pre-Trained Transformers #232

One-sentence summary:

The paper targets extracting multiple entity relations from an input paragraph and solves the multiple-passes issue: the MRE model encodes the input only once, which improves efficiency and scalability. It is built on BERT, with two modifications to the original architecture: (1) a structured prediction layer is introduced to predict multiple relations for different entity pairs (see the sketch below), and (2) the self-attention layers are made aware of the positions of all entities in the paragraph.
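
As a rough illustration of point (1), here is a minimal sketch of how a single encoding pass could feed a structured prediction layer that scores a relation label for every ordered entity pair. This is my own toy version under stated assumptions (mean-pooled entity spans, a simple concat-and-classify scorer; names like `MultiRelationHead`, `hidden_size`, `num_relations` are mine), not the paper's exact layer.

```python
import torch
import torch.nn as nn

class MultiRelationHead(nn.Module):
    """Sketch of a structured prediction layer: given ONE encoding of the
    paragraph, score a relation label for every ordered entity pair."""

    def __init__(self, hidden_size: int, num_relations: int):
        super().__init__()
        # Separate projections for the head/tail role of each entity,
        # then a simple concat-and-classify scorer (an assumption, not the paper's exact form).
        self.head_proj = nn.Linear(hidden_size, hidden_size)
        self.tail_proj = nn.Linear(hidden_size, hidden_size)
        self.classifier = nn.Linear(2 * hidden_size, num_relations)

    def forward(self, token_states, entity_spans):
        # token_states: (seq_len, hidden_size) from a single BERT forward pass
        # entity_spans: list of (start, end) token indices, one per entity
        entity_vecs = torch.stack(
            [token_states[s:e].mean(dim=0) for s, e in entity_spans]
        )                                       # (E, H), mean-pooled spans

        heads = self.head_proj(entity_vecs)     # (E, H)
        tails = self.tail_proj(entity_vecs)     # (E, H)

        # Score all E x E ordered pairs in one shot instead of one pass per pair.
        E, H = heads.shape
        pair_feats = torch.cat(
            [heads.unsqueeze(1).expand(E, E, H),
             tails.unsqueeze(0).expand(E, E, H)], dim=-1
        )                                       # (E, E, 2H)
        return self.classifier(pair_feats)      # (E, E, num_relations)

# Toy usage: 10 tokens, 3 entities, token_states assumed to come from one BERT pass.
head = MultiRelationHead(hidden_size=768, num_relations=5)
token_states = torch.randn(10, 768)
logits = head(token_states, entity_spans=[(0, 2), (4, 5), (7, 9)])
print(logits.shape)  # torch.Size([3, 3, 5])
```

The point of the one-pass setup is that `token_states` comes from a single encoder forward pass, after which all entity pairs are classified together rather than re-encoding the paragraph once per pair.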

Resources:

Paper information:

Notes:

3.2 Entity-Aware Self-Attention based on Relative Distance

(equation screenshots from Section 3.2 of the paper)
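
My reading of Section 3.2 is that the attention scores get an extra term from embeddings of the (clipped) relative distance between token positions, applied only where entity tokens are involved. The sketch below is a single-head toy version in that spirit (Shaw et al.-style relative-position attention gated by an entity mask); the class name, `max_distance`, and the gating rule are my assumptions, not the paper's exact equations.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EntityAwareAttention(nn.Module):
    """Sketch of one self-attention head whose scores get a relative-distance
    term, kept only where the query or key token is an entity token."""

    def __init__(self, hidden_size: int, max_distance: int = 16):
        super().__init__()
        self.q = nn.Linear(hidden_size, hidden_size)
        self.k = nn.Linear(hidden_size, hidden_size)
        self.v = nn.Linear(hidden_size, hidden_size)
        # One embedding per clipped relative distance in [-max_distance, max_distance].
        self.rel_key = nn.Embedding(2 * max_distance + 1, hidden_size)
        self.max_distance = max_distance
        self.scale = hidden_size ** -0.5

    def forward(self, x, entity_mask):
        # x: (seq_len, hidden_size); entity_mask: (seq_len,) bool, True for entity tokens
        L, H = x.shape
        q, k, v = self.q(x), self.k(x), self.v(x)

        # Clipped relative distance j - i between every pair of positions.
        pos = torch.arange(L)
        dist = (pos[None, :] - pos[:, None]).clamp(-self.max_distance, self.max_distance)
        rel_k = self.rel_key(dist + self.max_distance)            # (L, L, H)

        # Standard content-content scores plus a relative-distance term.
        scores = q @ k.t()                                        # (L, L)
        rel_scores = (q.unsqueeze(1) * rel_k).sum(-1)             # (L, L)

        # Gate: keep the distance term only where an entity token is involved
        # (this gating rule is my assumption about "entity-aware").
        entity_pair = entity_mask[:, None] | entity_mask[None, :]
        scores = (scores + rel_scores * entity_pair.float()) * self.scale

        attn = F.softmax(scores, dim=-1)
        return attn @ v

# Toy usage: 12 tokens, entity tokens at positions 2, 3, 8.
attn = EntityAwareAttention(hidden_size=64)
x = torch.randn(12, 64)
entity_mask = torch.zeros(12, dtype=torch.bool)
entity_mask[[2, 3, 8]] = True
out = attn(x, entity_mask)
print(out.shape)  # torch.Size([12, 64])
```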

Model diagram:

(model architecture figure from the paper)

Results

(results table screenshots from the paper)

(Does this result also suggest that a simpler classifier works better?)

Papers to read next