tatsuropfgt/papers (read paper memo)
On the Global Self-attention Mechanism for Graph Convolutional Networks
#13 Open
tatsuropfgt opened 1 year ago

tatsuropfgt commented 1 year ago
On the Global Self-attention Mechanism for Graph Convolutional Networks [Wang+, 20]
Abstract
Applies global self-attention (GSA) to GCNs
GSA allows GCNs to capture feature-based vertex relations regardless of edge connections
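A minimal sketch of what "feature-based vertex relations regardless of edge connections" means: every vertex attends to every other vertex based on feature similarity alone, with no adjacency mask. This is standard scaled dot-product self-attention, not the paper's exact formulation; the projection matrices `Wq`, `Wk`, `Wv` and the toy sizes are assumptions for illustration.

```python
import numpy as np

def global_self_attention(X, Wq, Wk, Wv):
    """Global self-attention over node features X (n x d).

    Note: no adjacency matrix appears anywhere -- attention weights
    come purely from feature similarity, so distant, unconnected
    vertices can still exchange information.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])        # n x n feature affinities
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)       # row-wise softmax
    return attn @ V                               # updated node features

# Toy example (assumed sizes): 4 nodes, 3-dim features, 2-dim projections.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
Wq, Wk, Wv = (rng.normal(size=(3, 2)) for _ in range(3))
H = global_self_attention(X, Wq, Wk, Wv)
print(H.shape)  # (4, 2)
```

In a GSA-augmented GCN layer, an output like `H` would typically be combined with the usual edge-based aggregation rather than replace it.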
Background
The GSA mechanism has achieved remarkable success on CNNs
Since CNNs and GCNs share a similar convolutional structure, the authors apply GSA to GCNs
Detail
Experiments on node and graph classification tasks
Outperforms multiple carefully designed advanced methods
Analyzes the impacts of overfitting and over-smoothing, two issues closely connected to graph edge structure
GSA can mitigate both issues
Section 4 gives a theoretical analysis of these issues
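A tiny numerical illustration of why over-smoothing is tied to edge structure: repeatedly applying a plain GCN's edge-based propagation (row-normalized adjacency averaging) drives all node features toward a common value. The 4-node path graph and the number of propagation steps are assumptions for illustration, not from the paper.

```python
import numpy as np

# 4-node path graph 0-1-2-3 (assumed toy example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                      # add self-loops
P = np.diag(1.0 / A_hat.sum(axis=1)) @ A_hat  # row-normalized propagation

X = np.array([[1.0], [0.0], [0.0], [1.0]])    # one scalar feature per node
for _ in range(100):                          # 100 propagation steps
    X = P @ X                                 # neighbor averaging each step

# The spread across nodes collapses toward 0: features are smoothed out
# and nodes become indistinguishable.
print(np.ptp(X))
```

Because GSA's attention weights depend on features rather than edges, it is not locked into this fixed averaging operator, which is the intuition behind its claimed mitigation of over-smoothing.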