CarloLucibello / GraphNeuralNetworks.jl

Graph Neural Networks in Julia
https://carlolucibello.github.io/GraphNeuralNetworks.jl/dev/
MIT License

Layers support for HeteroGraphConv #311

Open CarloLucibello opened 1 year ago

CarloLucibello commented 1 year ago

HeteroGraphConv layers are built out of standard graph conv layers, which are individually applied to the different relations. The list of layers supporting integration with HeteroGraphConv should be extended.
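For context, a HeteroGraphConv pairs each relation of a heterograph with its own inner conv layer; a minimal sketch of the intended usage (node-type names, relation names, and dimensions are illustrative):

```julia
using GraphNeuralNetworks, Flux

# A random bipartite heterograph with node types :A (10 nodes) and :B (15 nodes)
# and 20 edges per direction.
g = rand_bipartite_heterograph((10, 15), 20)
x = (A = rand(Float32, 64, 10), B = rand(Float32, 64, 15))

# One standard conv layer per relation; each layer sees the (src, dst)
# feature pair of its own relation.
layer = HeteroGraphConv((:A, :to, :B) => GraphConv(64 => 32, relu),
                        (:B, :to, :A) => GraphConv(64 => 32, relu))

y = layer(g, x)   # NamedTuple of output features, one entry per node type
```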

codetalker7 commented 11 months ago

Hi @CarloLucibello. Is this a documentation issue? I looked at the implementation in src/layers/heteroconv.jl, and it seems like the implementation takes care of all standard graph conv layers (in the sense that the layers field of HeteroGraphConv can be any GNNLayer, and the code should work for any layer type). Am I missing something?

CarloLucibello commented 11 months ago

No, it is actually an implementation issue that involves a tiny change to the forward pass of each layer. In the original PR (#300) it was done only for GraphConv. One needs to relax the forward signature to take an AbstractGNNGraph instead of a GNNGraph and then insert at the beginning the line

xj, xi = expand_srcdst(g, x)

since during the hetero message passing x is a tuple containing the src and dst features (relative to two different node types).
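Concretely, the change applied to GraphConv in #300 follows roughly this pattern (a sketch, not the exact source; the body mirrors GraphConv's usual message passing):

```julia
# Before: the layer only accepted homogeneous graphs.
# function (l::GraphConv)(g::GNNGraph, x)

# After: also accepts heterograph views, where x is a (src, dst) feature tuple.
function (l::GraphConv)(g::AbstractGNNGraph, x)
    check_num_nodes(g, x)
    xj, xi = expand_srcdst(g, x)  # homogeneous: xj === xi; hetero: src/dst features
    m = propagate(copy_xj, g, l.aggr, xj = xj)          # messages from source nodes
    return l.σ.(l.weight1 * xi .+ l.weight2 * m .+ l.bias)  # update on dst nodes
end
```

With this pattern, the same forward pass serves both the homogeneous and the heterogeneous case, since expand_srcdst handles the unpacking.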

codetalker7 commented 10 months ago

Hi @CarloLucibello, another small question: for layers like AGNNConv which involve self-loops, won't such layers then have to operate on two types of edges? Because we'll have to add self-loops for nodes of the target type (this is being done currently in the AGNNConv implementation: line 1018 of https://github.com/CarloLucibello/GraphNeuralNetworks.jl/blob/master/src/layers/conv.jl).

Also, motivated by this: it seems that adding self-loops to GNNHeteroGraphs is not supported. Would it be a good idea to add this functionality, so that we can add self-loops for a particular node type?

CarloLucibello commented 10 months ago

When using something like AGNNConv inside a heterograph, the add_self_loops option should be set to false, since it doesn't make sense to add self-loops in a relation (node1_t, edge_t, node2_t) unless the two node types are the same.
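For example, assuming AGNNConv's add_self_loops keyword and illustrative relation names, the hetero usage would look like:

```julia
using GraphNeuralNetworks

# Self-loops are disabled: in a relation (:A, :to, :B) the source and
# destination node types differ, so a self-loop is not well defined.
layer = HeteroGraphConv((:A, :to, :B) => AGNNConv(add_self_loops = false),
                        (:B, :to, :A) => AGNNConv(add_self_loops = false))
```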

AarSeBail commented 10 months ago

Perhaps a HANConv layer would be a helpful addition to this list: https://arxiv.org/abs/1903.07293

AarSeBail commented 6 months ago

Am I correct in my thinking that these implementations for heterographs should be "type blind"?