PetarV- / GAT

Graph Attention Networks (https://arxiv.org/abs/1710.10903)
https://petar-v.com/GAT/
MIT License

question about attention layer #39

Closed caizhenhui closed 4 years ago

caizhenhui commented 4 years ago

Hi! According to the paper, attention is only computed between a node and its neighbours. I'm confused by the code for "attention_layer": it seems to be computed over all node pairs, not only neighbouring nodes?

PetarV- commented 4 years ago

Hello,

Thank you for the issue and your interest in GAT!

You are correct that the coefficients are calculated for every node pair. However, before applying the softmax function, we add the bias matrix (bias_mat), setting all non-edge pairs to "negative infinity" (-1e9). This forces the softmax to assign them a weight of zero, effectively discarding them.
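The masking trick described above can be sketched in a few lines of numpy. This is a hypothetical minimal illustration, not the repo's actual code: the function name `masked_softmax_attention` and the toy inputs are made up for the example, but the mechanism (add `-1e9` at non-edge positions, then softmax) mirrors the `bias_mat` idea.

```python
import numpy as np

def masked_softmax_attention(logits, adj):
    """logits: (N, N) raw attention scores for every node pair.
    adj: (N, N) 0/1 adjacency matrix (with self-loops).
    Returns row-normalised attention coefficients."""
    # 0 on edges, -1e9 ("negative infinity") on non-edges, as with bias_mat
    bias_mat = -1e9 * (1.0 - adj)
    masked = logits + bias_mat
    # numerically stable softmax over each row
    e = np.exp(masked - masked.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Toy graph: node 0 is connected to itself and node 1, but not node 2.
adj = np.array([[1., 1., 0.],
                [1., 1., 1.],
                [0., 1., 1.]])
logits = np.ones((3, 3))   # identical scores for every pair
att = masked_softmax_attention(logits, adj)
# Row 0 gives 0.5 to nodes 0 and 1, and (numerically) 0 to non-neighbour 2.
```

Even though the logits were identical everywhere, the softmax output for each node is supported only on its neighbourhood, which is exactly why the dense computation still respects the graph structure.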

Hope this helps!

Thanks, Petar

caizhenhui commented 4 years ago

Thank you!