PetarV-/GAT

Graph Attention Networks (https://arxiv.org/abs/1710.10903)
https://petar-v.com/GAT/

A question about attention #30

Open pangsg opened 5 years ago

pangsg commented 5 years ago

In the paper, we only concatenate Wh_i and Wh_j. How does this reflect the information exchange between node i and node j?

EtoDemerzel0427 commented 4 years ago

After the concatenation, GAT takes a dot product of a learnable attention vector a and [Wh_i || Wh_j] to get a scalar, i.e. e_ij = LeakyReLU(a^T [Wh_i || Wh_j]). These scalars are normalized with a softmax over node i's neighborhood to give the attention coefficients α_ij, and the actual information exchange happens in the aggregation step, h_i' = σ(Σ_j α_ij Wh_j), where node j's transformed features flow into node i weighted by α_ij.
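
For concreteness, here is a minimal single-head NumPy sketch of that computation. The graph, feature sizes, and random weights are hypothetical stand-ins; the repo's actual implementation is in TensorFlow, but the math is the same:

```python
import numpy as np

def leaky_relu(x, slope=0.2):  # slope 0.2, as used in the GAT paper
    return np.where(x > 0, x, slope * x)

# Hypothetical sizes: N nodes, F input features, F' output features.
N, F, F_out = 4, 3, 2
rng = np.random.default_rng(0)

h = rng.normal(size=(N, F))        # input node features h_i
W = rng.normal(size=(F, F_out))    # shared linear transform W
a = rng.normal(size=(2 * F_out,))  # attention vector a

Wh = h @ W                         # transformed features Wh_i

# Example adjacency with self-loops (defines each node's neighborhood).
adj = np.array([[1, 1, 1, 0],
                [1, 1, 0, 1],
                [1, 0, 1, 1],
                [0, 1, 1, 1]], dtype=bool)

# e_ij = LeakyReLU(a^T [Wh_i || Wh_j]) for every pair (i, j).
concat = np.concatenate(
    [np.repeat(Wh[:, None, :], N, axis=1),   # Wh_i broadcast over j
     np.repeat(Wh[None, :, :], N, axis=0)],  # Wh_j broadcast over i
    axis=-1)                                 # shape (N, N, 2F')
e = leaky_relu(concat @ a)                   # shape (N, N)

# Masked softmax over each neighborhood -> attention coefficients alpha_ij.
e = np.where(adj, e, -np.inf)
alpha = np.exp(e - e.max(axis=1, keepdims=True))
alpha = alpha / alpha.sum(axis=1, keepdims=True)

# Aggregation: this weighted sum is where node j's features reach node i.
h_out = alpha @ Wh                           # shape (N, F')
print(np.round(alpha, 3))
```

So even though the concatenation itself only places Wh_i and Wh_j side by side, the scalar it produces decides how strongly j's features are mixed into i's new representation in the final weighted sum.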