Closed: joneswong closed this 3 years ago
Thanks for the contribution. Could you add this contribution to
https://github.com/snap-stanford/GraphGym/tree/master/graphgym/contrib/layer ?
There you can call it generaledgeconv_v2
or whatever name you like.
I may want to keep the current generalconv
as it is to ensure the reproducibility of existing results.
Updated accordingly.
This PR adds a `gnn.flow` argument to enable message passing in either direction for directed graphs, and updates `GeneralConvLayer` to normalize the adjacency matrix of directed graphs according to the discussion. When `gnn.self_msg=="none"` and `gnn.normalize_adj==False`, no self loop would be added; thus, in the current implementation, the message passing procedure ignores the node's own embedding from the previous layer, i.e., $$h_{v}^{(l-1)}$$, which is inconsistent with the convention.

I tested this PR:
The output matches what we expected. If we change to
cfg.gnn.flow="source_to_target"
as usual, the result is also what we expected.
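As a self-contained illustration of the two behaviors discussed above, here is a minimal sketch in plain Python. It is not GraphGym's actual `GeneralConvLayer`; the function `propagate` and its parameters are hypothetical, and it uses mean aggregation only to keep the example short. It shows how a `flow` setting reverses the message direction on a directed graph, and how omitting self loops drops a node's own previous-layer embedding $h_{v}^{(l-1)}$ from the aggregation.

```python
def propagate(edge_index, h, flow="source_to_target", add_self_loops=False):
    """Mean-aggregate neighbor features along directed edges.

    edge_index: list of (src, dst) pairs of a directed graph
    h: dict mapping node -> scalar feature (the previous-layer embedding)
    flow: "source_to_target" sends messages src -> dst;
          "target_to_source" reverses the direction.
    add_self_loops: if False, h_v^{(l-1)} is excluded from node v's update.
    """
    edges = list(edge_index)
    if add_self_loops:
        # A self loop (v, v) is what keeps h_v^{(l-1)} in the aggregation.
        edges += [(v, v) for v in h]
    out = {v: [] for v in h}
    for src, dst in edges:
        if flow == "source_to_target":
            out[dst].append(h[src])
        else:  # "target_to_source"
            out[src].append(h[dst])
    # Mean aggregation; nodes receiving no message get 0.0.
    return {v: sum(msgs) / len(msgs) if msgs else 0.0
            for v, msgs in out.items()}

# Directed chain 0 -> 1 -> 2
edge_index = [(0, 1), (1, 2)]
h = {0: 1.0, 1: 2.0, 2: 4.0}

print(propagate(edge_index, h))                           # node 0 receives nothing
print(propagate(edge_index, h, flow="target_to_source"))  # direction reversed
print(propagate(edge_index, h, add_self_loops=True))      # own embedding retained
```

Without self loops, node 0 (no incoming edge) ends up with no contribution from its own embedding at all, which is exactly the inconsistency described above; reversing `flow` moves that problem to node 2 instead.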