PetarV- / GAT

Graph Attention Networks (https://arxiv.org/abs/1710.10903)
https://petar-v.com/GAT/
MIT License

How can I run execute_cora.py without attention? #29

Open pyupya opened 5 years ago

pyupya commented 5 years ago

Hello! Thank you very much for the great paper and for releasing the code.

I'm a student interested in GAT, and I'm curious about the difference between using attention and not using it. I'd like to see that difference by running 'execute_cora.py'. How can I run the test with attention and without it? Thanks!

ghost commented 5 years ago

+1

TranSirius commented 5 years ago

Without attention, the layer can be viewed as an average aggregation scheme. You may rewrite the code and substitute the attention-based aggregation matrix with an average aggregation matrix (or a normalized Laplacian matrix) to get a no-attention version.
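
Here is a minimal NumPy sketch (not the repo's actual layers.py code) illustrating the idea: the attention version builds the aggregation matrix from masked-softmax coefficients, while the no-attention version simply row-normalizes the adjacency matrix so every neighbor gets weight 1/deg(i). The `scores` array below is a hypothetical stand-in for the learned attention logits e_ij from the paper.

```python
import numpy as np

def softmax_masked(scores, adj):
    # Only neighbors (adj > 0) compete in the softmax; non-edges get ~zero weight.
    scores = np.where(adj > 0, scores, -1e9)
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def attention_aggregation(h, adj, scores):
    # Attention coefficients alpha_ij weight each neighbor's features.
    alpha = softmax_masked(scores, adj)
    return alpha @ h

def mean_aggregation(h, adj):
    # No attention: replace the coefficients with uniform 1/deg(i) weights.
    deg = adj.sum(axis=1, keepdims=True)
    return (adj / deg) @ h

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, f = 5, 4
    adj = (rng.random((n, n)) < 0.5).astype(float)
    np.fill_diagonal(adj, 1.0)                 # add self-loops
    h = rng.normal(size=(n, f))                # toy node features
    scores = rng.normal(size=(n, n))           # hypothetical attention logits
    print(attention_aggregation(h, adj, scores))
    print(mean_aggregation(h, adj))
```

In the actual TensorFlow code you would make the analogous change inside the attention head: keep the linear transform of the features, but feed the aggregation a row-normalized adjacency instead of the softmax of the learned attention scores, then compare the Cora accuracies of the two variants.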