Open CarloLucibello opened 1 year ago
Hi @CarloLucibello. Is this a documentation issue? I looked at the implementation in `src/layers/heteroconv.jl`, and it seems like it takes care of all standard graph conv layers (in the sense that the `layers` field in `HeteroGraphConv` can be any `GNNLayer`, and the code should work for any layer type). Am I missing something?
No, it is actually an implementation issue that involves a tiny change to the forward pass of each layer. In the original PR (#300) it was done only for `GraphConv`.

One needs to relax the forward signature to take an `AbstractGNNGraph` instead of a `GNNGraph` and then insert at the beginning the line `xj, xi = expand_srcdst(g, x)`, since during the hetero message passing `x` is a tuple containing the src and dst features (relative to two different node types).
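A minimal sketch of what this change looks like, modeled on the `GraphConv` adaptation in PR #300 (`SomeConv` and its fields are placeholders here, not the actual implementation):

```julia
# Before: the forward pass only accepted a homogeneous graph.
# function (l::SomeConv)(g::GNNGraph, x)

# After: relax the signature and split the features into src/dst parts.
function (l::SomeConv)(g::AbstractGNNGraph, x)
    # On a plain GNNGraph this is just (x, x); on a heterograph relation,
    # x is a (xsrc, xdst) tuple holding features of the two node types.
    xj, xi = expand_srcdst(g, x)
    m = propagate(copy_xj, g, l.aggr; xj)        # messages built from source features
    return l.σ.(l.weight1 * xi .+ l.weight2 * m) # update uses target features
end
```

The rest of the layer body is unchanged; only the signature and the initial unpacking differ between the homogeneous and heterogeneous cases.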
Hi @CarloLucibello, another small question: layers like `AGNNConv` involve self-loops, so such layers will have to operate on two types of edges, right? Because we'll have to add self-loops for nodes of the target type (this is being done currently in the `AGNNConv` implementation: line 1018 of https://github.com/CarloLucibello/GraphNeuralNetworks.jl/blob/master/src/layers/conv.jl).
Also, motivated by this: it seems like adding self-loops to `GNNHeteroGraph`s is not supported. Would it be a good idea to add this functionality, so that we can add self-loops for a particular node type?
When using something like `AGNNConv` inside a heterograph, the `add_self_loops` option should be set to `false`, since it doesn't make sense to add self-loops in a relation `(node1_t, edge_t, node2_t)` unless the two node types are the same.
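For example (a hypothetical model; the relation names and sizes are made up), one would restrict `AGNNConv` to a same-type relation and disable its self-loop insertion:

```julia
model = HeteroGraphConv(
    (:user, :follows, :user) => AGNNConv(add_self_loops = false),
    (:user, :rates, :movie)  => GraphConv(64 => 64),
)
```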
Perhaps an additional `HANConv` layer (https://arxiv.org/abs/1903.07293) would be helpful to add to this list.
Am I correct in my thinking that these implementations for heterographs should be "type blind"?
`HeteroGraphConv` is built out of standard graph conv layers, which are individually applied to the different relations. The list of layers supporting integration with `HeteroGraphConv` should be extended:
- AGNNConv
- CGConv
- ChebConv
- EGNNConv
- EdgeConv
- GATConv
- GATv2Conv
- GatedGraphConv
- GCNConv
- GINConv
- GMMConv
- GraphConv
- MEGNetConv
- NNConv
- ResGatedGraphConv
- SAGEConv
- SGConv
- TransformerConv
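To illustrate the setup described above, here is a minimal sketch of building and calling a `HeteroGraphConv` from standard layers (the relation names, feature sizes, and toy graph are made up for illustration):

```julia
using GraphNeuralNetworks

# Toy heterograph: 2 users, 3 movies, edges in both directions.
g = GNNHeteroGraph((:user, :rates, :movie)    => ([1, 1, 2], [1, 2, 3]),
                   (:movie, :rated_by, :user) => ([1, 2, 3], [1, 1, 2]))

# One standard conv layer per relation; outputs targeting the same
# node type are combined with `aggr`.
model = HeteroGraphConv((:user, :rates, :movie)    => GraphConv(64 => 32),
                        (:movie, :rated_by, :user) => GraphConv(64 => 32);
                        aggr = +)

x = (user = rand(Float32, 64, 2), movie = rand(Float32, 64, 3))
y = model(g, x)  # NamedTuple with updated features for :user and :movie
```

Each inner layer only ever sees a single relation, which is why the `expand_srcdst` change above is all that is needed to make a standard layer hetero-compatible.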