tech-srl / bottleneck

Code for the paper: "On the Bottleneck of Graph Neural Networks and Its Practical Implications"
MIT License

Self loops in Tree-NeighborsMatch datasets #8

Closed jhonygiraldo closed 2 years ago

jhonygiraldo commented 2 years ago

Hi Uri,

I hope you're doing well. Your paper is really nice, congrats.

I just have a small question. If I understood your code correctly, you generate several trees and then stack them to train in batches. What I don't understand is why you add self loops to all nodes. The illustrations in the paper don't show any self loops. I guess the only difference is that each node's own embedding is included when performing message passing, right? Or is there something I'm missing?

Thank you very much for releasing the code :)

urialon commented 2 years ago

Hi @jhonygiraldo, Thank you for your interest in our work and for your kind words.

Adding self-loops just performed slightly better, so we added them. The purpose of these experiments was to show that this simple-looking Tree-NeighborsMatch benchmark is difficult, no matter what tricks are used. So we added self-loops and experimented with many kinds of batch normalization, layer normalization, nonlinearities, residual connections, etc. Still, no model could perform well starting from a radius of 5.
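To make the effect concrete, here is a minimal, hypothetical sketch (not the repo's actual code, which builds on graph-library utilities) of what adding self-loops means for a graph's edge list: every node gets an edge `(i, i)`, so its own embedding is among the messages it aggregates.

```python
def add_self_loops(edges, num_nodes):
    """Return the edge list extended with a (i, i) loop for every node i
    that does not already have one. With self-loops, each node's own
    embedding participates in message passing alongside its neighbors'."""
    existing = set(edges)
    loops = [(i, i) for i in range(num_nodes) if (i, i) not in existing]
    return edges + loops

# A tiny 3-node path (edges in both directions, as in an undirected tree):
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
print(add_self_loops(edges, 3))
# → [(0, 1), (1, 0), (1, 2), (2, 1), (0, 0), (1, 1), (2, 2)]
```

In PyTorch Geometric, the equivalent operation is `torch_geometric.utils.add_self_loops` applied to the `edge_index` tensor.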

Let me know if you have any questions! Best, Uri

jhonygiraldo commented 2 years ago

Great, thanks Uri, I understand now :)

Best, Jhony.