snap-stanford / ogb

Benchmark datasets, data loaders, and evaluators for graph machine learning
https://ogb.stanford.edu
MIT License

A Question about GCN Implementation in the examples #348

Closed · shuoyinn closed this 1 year ago

shuoyinn commented 1 year ago

Hello, your work is a great contribution to the community and I really appreciate it; it has been very helpful for my study.

But I have a small question: I am confused by your implementation of GCN. Why do you set `root_emb` as a learnable embedding (parameter) inside a GCN conv? I saw that it is added to a linearly projected `x`, followed by a ReLU and degree normalization. In other words, I would like to know the function and motivation of the variable `root_emb`. Is this the first use of such a variable in the literature, or does it follow someone else's work?

I thought it was related to the virtual node method, but the virtual node is already implemented in `GNN_node_Virtualnode`, so I still don't understand the purpose of `root_emb` here.

[screenshot of the GCNConv implementation in the examples]
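
For reference, the layer in the screenshot looks roughly like this (a sketch reconstructed from the description in this thread and the PyG `MessagePassing` API; here `edge_attr` is assumed to already be embedded to the node feature dimension):

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import MessagePassing
from torch_geometric.utils import degree

class GCNConv(MessagePassing):
    def __init__(self, emb_dim):
        super().__init__(aggr='add')
        self.linear = torch.nn.Linear(emb_dim, emb_dim)
        # The learnable "self-edge" embedding this issue asks about.
        self.root_emb = torch.nn.Embedding(1, emb_dim)

    def forward(self, x, edge_index, edge_attr):
        x = self.linear(x)
        row, col = edge_index
        deg = degree(row, x.size(0), dtype=x.dtype) + 1  # +1 counts the self-loop
        deg_inv_sqrt = deg.pow(-0.5)
        deg_inv_sqrt[deg_inv_sqrt == float('inf')] = 0
        norm = deg_inv_sqrt[row] * deg_inv_sqrt[col]

        # Neighbor messages, plus the degree-normalized self-loop term
        # built from root_emb.
        return self.propagate(edge_index, x=x, edge_attr=edge_attr, norm=norm) \
            + F.relu(x + self.root_emb.weight) * 1.0 / deg.view(-1, 1)

    def message(self, x_j, edge_attr, norm):
        # Edge features enter the GCN message here.
        return norm.view(-1, 1) * F.relu(x_j + edge_attr)
```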

weihua916 commented 1 year ago

Hi! Glad to hear you find OGB useful!

Note that the original GCN cannot incorporate edge feature information. Therefore, we extend the original GCN by making the message `norm.view(-1, 1) * F.relu(x_j + edge_attr)` (see here) instead of `norm.view(-1, 1) * x_j`. You can then think of `self.root_emb` as the `edge_attr` analogue for the self-edge.
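
To make the contrast concrete, here is a standalone illustration on dummy tensors (names and shapes are illustrative, not from the repository):

```python
import torch
import torch.nn.functional as F

norm = torch.rand(5)           # per-edge normalization deg_i^-1/2 * deg_j^-1/2
x_j = torch.randn(5, 8)        # source-node features, one row per edge
edge_attr = torch.randn(5, 8)  # edge embeddings, same dim as node features

# Original GCN message: edge features cannot enter.
msg_vanilla = norm.view(-1, 1) * x_j

# Extended message used here: edge features are folded into each message.
msg_extended = norm.view(-1, 1) * F.relu(x_j + edge_attr)
```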

shuoyinn commented 1 year ago

@weihua916 Thanks for such a prompt reply. Now I see: you want an analogue of `x_j + edge_attr` for the self-loop (since a self-loop is the default setting for a GCNConv), so after aggregating the `x_j + edge_attr` messages you add an extra `x_i + edge_attr` term to realize it. Since the `edge_attr` for `x_i`'s self-loop does not exist in the raw `data.edge_attr`, you simply make it a learnable (trainable) parameter. It helps a lot. Thanks again =v=
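
Putting the two terms together, the layer update can be summarized as follows (notation is mine, with $\hat d_i = d_i + 1$ counting the self-loop, $e_{ij}$ the edge embedding, and $r$ the `root_emb` vector):

$$
h_i' = \sum_{j \in \mathcal{N}(i)} \frac{\mathrm{ReLU}(W x_j + e_{ij})}{\sqrt{\hat d_i \hat d_j}} \;+\; \frac{\mathrm{ReLU}(W x_i + r)}{\hat d_i}
$$

The second term is exactly the self-loop treated as one more edge: its $1/\hat d_i$ factor matches the $1/(\sqrt{\hat d_i}\sqrt{\hat d_i})$ normalization an edge from $i$ to itself would receive.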

weihua916 commented 1 year ago

Exactly, that's the intention. You are welcome.