
SIGIR(WS)-2019/07-Do Transformer Attention Heads Provide Transparency in Abstractive Summarization? #251


BrambleXu commented 5 years ago

Summary:

Analyzes the transparency of Transformer attention heads in abstractive summarization. Cites #235. Exploiting self-attention from the angle of model interpretability also looks like a promising direction.
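
As a concrete illustration of the idea, below is a minimal sketch (not the paper's code) of extracting per-head attention weights for this kind of transparency analysis, assuming PyTorch's `nn.MultiheadAttention`; all sizes and variable names are illustrative.

```python
import torch
import torch.nn as nn

# Hypothetical setup: one multi-head attention layer over a dummy "document".
# Dimensions are arbitrary choices for illustration.
embed_dim, num_heads, seq_len = 64, 4, 10
attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

x = torch.randn(1, seq_len, embed_dim)  # batch of 1, 10 tokens

# average_attn_weights=False keeps a separate weight matrix per head:
# shape (batch, num_heads, tgt_len, src_len)
_, weights = attn(x, x, x, need_weights=True, average_attn_weights=False)

for h in range(num_heads):
    # Which source token each head attends to most at every target position --
    # a simple proxy for asking how "transparent" a head's behavior is.
    print(f"head {h}:", weights[0, h].argmax(dim=-1).tolist())
```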

Resource:

Paper information:

Notes:

Model Graph:

Result:

Thoughts:

Next Reading: