0. Paper
1. What is it?
They propose a method considering KG entities and KG neighbors via the attention mechanism.
2. What is amazing compared to previous works?
Their model considers not only the relations connected in the KG, but also KG neighbors that are not connected.

![Screenshot 2023-02-09 15 28 49](https://user-images.githubusercontent.com/45454055/217735126-2b8e649f-8f8b-424e-9816-0114f25a5ea0.png)
3. Where is the key to technologies and techniques?
3.1 Knowledge Graph Attention
Basically, the relational structure (head $h$, relation $r$, tail $t$) of the KG embeddings $v_{KG}(h), v_{r}(r), v_{KG}(t)$ is defined as follows:

$$v_{KG}(t) \approx \phi(v_{KG}(h), v_{r}(r))$$
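As a concrete instance of this structure, a minimal sketch is shown below, assuming a TransE-style composition $\phi(h, r) = h + r$ (one common choice; the paper's $\phi$ may be defined differently):

```python
import numpy as np

def phi(v_head: np.ndarray, v_rel: np.ndarray) -> np.ndarray:
    """Predict the tail embedding from head and relation embeddings.

    Assumes a TransE-style composition phi(h, r) = h + r;
    the paper's phi may differ.
    """
    return v_head + v_rel

rng = np.random.default_rng(0)
dim = 4
v_h = rng.normal(size=dim)   # v_KG(h)
v_r = rng.normal(size=dim)   # v_r(r)
v_t = phi(v_h, v_r)          # v_KG(t) ~ phi(v_KG(h), v_r(r))
assert v_t.shape == (dim,)
```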
In this paper, they define the structure using the attention mechanism.
$$Q^i = v_{KG}(w^i) \oplus v_{PLM}(w^i)$$
$$K^{i, j} = V^{i, j} = v_{KG}(w^i) \oplus v_{PLM}(w^i)\ \ (\textrm{if}\ j = 0)$$
$$K^{i, j} = V^{i, j} = \phi(v_{KG}(w^i), v_{r}(r^i)) \oplus v_{PLM}(w^i)\ \ (\textrm{if}\ j > 0)$$
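The query/key/value construction above can be sketched as follows. This is a simplified illustration, not the paper's implementation: `phi` is assumed to be a TransE-style addition, and a single-head scaled dot-product attention is used over the entity itself ($j = 0$) and its relation-composed neighbors ($j > 0$):

```python
import numpy as np

def softmax(x: np.ndarray, axis: int = -1) -> np.ndarray:
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def phi(v_head: np.ndarray, v_rel: np.ndarray) -> np.ndarray:
    # Assumed TransE-style composition; the paper's phi may differ.
    return v_head + v_rel

def kg_attention(v_kg_w: np.ndarray,
                 v_plm_w: np.ndarray,
                 v_r_neighbors: np.ndarray) -> np.ndarray:
    """Sketch of the KG attention for one token w^i.

    v_kg_w:        KG embedding v_KG(w^i), shape (d,)
    v_plm_w:       PLM embedding v_PLM(w^i), shape (d,)
    v_r_neighbors: relation embeddings v_r(r^i) for each j > 0, shape (n, d)
    """
    # Q^i = v_KG(w^i) ++ v_PLM(w^i)
    q = np.concatenate([v_kg_w, v_plm_w])
    # j = 0: the entity itself
    kv = [np.concatenate([v_kg_w, v_plm_w])]
    # j > 0: neighbors reached through relation r
    for v_r in v_r_neighbors:
        kv.append(np.concatenate([phi(v_kg_w, v_r), v_plm_w]))
    K = V = np.stack(kv)                          # shape (n + 1, 2d)
    # Single-head scaled dot-product attention over the index j
    weights = softmax(q @ K.T / np.sqrt(q.size))
    return weights @ V                            # shape (2d,)
```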
3.2 Overview
4. How did they evaluate it?
The results below show that the proposed RoBERTa-based method achieves higher performance than prior works.
Relation Classification Task

![Screenshot 2023-02-09 16 00 51](https://user-images.githubusercontent.com/45454055/217740763-c47aa245-42e3-431b-8e13-e3f6b159b64e.png)
Entity Typing Task

![Screenshot 2023-02-09 16 01 18](https://user-images.githubusercontent.com/45454055/217740844-d246aef3-0842-4f31-812d-ba05bf31dde9.png)
5. Is there a discussion?
6. Which paper should we read next?