PetarV- / GAT

Graph Attention Networks (https://arxiv.org/abs/1710.10903)
https://petar-v.com/GAT/
MIT License

Why does the second for loop in gat.py use h_1 as the input to layers.attn_head rather than h_old? #16

Closed: svjack closed this issue 5 years ago

svjack commented 5 years ago

The problem is as the title describes.

PetarV- commented 5 years ago

h_old seems to be a remnant of an older version of the code (from when we were experimenting with various kinds of skip connections). It is unused in the current version; all operations are done on h_1.
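
For reference, here is a minimal, paraphrased sketch of the hidden-layer loop being discussed (not the repository's exact code: `attn_head` below is a simplified stand-in for `layers.attn_head`, and the arguments and shapes are reduced for illustration). It shows that `h_old` is assigned but never read again, so every head consumes `h_1`:

```python
import numpy as np

def attn_head(seq, out_sz):
    """Stand-in for layers.attn_head: one attention head projecting to out_sz.
    The real head also applies attention coefficients, bias, and an activation."""
    w = np.random.randn(seq.shape[-1], out_sz)
    return seq @ w

def inference(inputs, hid_units, n_heads):
    # First layer: n_heads[0] heads over the raw inputs, concatenated.
    attns = [attn_head(inputs, hid_units[0]) for _ in range(n_heads[0])]
    h_1 = np.concatenate(attns, axis=-1)

    # Hidden layers: h_old is assigned but never used afterwards -- every
    # head takes h_1 as input, which is what this issue asks about.
    for i in range(1, len(hid_units)):
        h_old = h_1  # leftover from earlier experiments with skip connections
        attns = [attn_head(h_1, hid_units[i]) for _ in range(n_heads[i])]
        h_1 = np.concatenate(attns, axis=-1)
    return h_1

# Example: 5 nodes with 8 features, two hidden layers of 4 units, 2 heads each.
print(inference(np.random.randn(5, 8), hid_units=[4, 4], n_heads=[2, 2]).shape)
```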