Closed: codetalker7 closed this issue 10 months ago.
Also, while we're on the subject: from the definition of `AGNNConv`, the forward pass also includes multiplication of the features by a matrix $W$. But this doesn't seem to be the case in the implementation (also, shouldn't the matrix $W$ be a field of the `AGNNConv` struct)?
It is true that there is no matrix $W$; the docstring will be fixed in #328.
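For reference, the update the layer is meant to compute (following the AGNN paper, as far as I recall it) is

$$\mathbf{x}_i' = \sum_{j \in \mathcal{N}(i) \cup \{i\}} \alpha_{ij}\, \mathbf{x}_j, \qquad \alpha_{ij} = \operatorname{softmax}_j\big(\beta \cos(\mathbf{x}_i, \mathbf{x}_j)\big),$$

with the scalar $\beta$ as the only learnable parameter, so no $W$ appears anywhere.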
As for the fact that self loops are added even if they are already present: this is expected. This library supports multiedges, so it allows for the possibility of multiple self loops on the same node. The layer behaves the same as its PyTorch Geometric counterpart.
If a different behavior is wanted, then it is better to preprocess the graph as desired before passing it to the layer, as in the sketch below.
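A minimal sketch of that preprocessing, using the `GNNGraph` utilities `add_self_loops` and `remove_self_loops` (the specific graph here is just an illustration):

```julia
using GraphNeuralNetworks

# A 3-node directed cycle where node 1 already has a self loop.
g = GNNGraph([1, 1, 2, 3], [1, 2, 3, 1])

# add_self_loops appends one loop per node without checking for
# existing ones, so node 1 ends up with two self loops:
add_self_loops(g).num_edges          # 7 = 4 original edges + 3 new loops

# To avoid duplicates, strip existing loops before the layer sees the graph:
g_clean = remove_self_loops(g)
add_self_loops(g_clean).num_edges    # 6 = 3 cycle edges + one loop per node
```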
In #328 I also add the option to set `add_self_loops=false`.
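A usage sketch, assuming the keyword lands as named above (the exact constructor signature in the merged PR may differ):

```julia
using GraphNeuralNetworks

l = AGNNConv(add_self_loops = false)  # assumed keyword from #328

g = GNNGraph([1, 2, 3], [2, 3, 1])    # directed 3-cycle, no self loops
x = rand(Float32, 4, 3)               # 4 features per node
y = l(g, x)                           # attention runs only over the edges actually present
```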
From the definition of `AGNNConv`, I do not expect the following behaviour:
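Something along these lines (a minimal sketch; the graph and feature values are illustrative, not my exact snippet):

```julia
using GraphNeuralNetworks

# A graph that already has a self loop on every node.
s = [1, 2, 3, 1, 2, 3]
t = [2, 3, 1, 1, 2, 3]
g = GNNGraph(s, t)        # 6 edges: a directed 3-cycle plus 3 self loops

x = rand(Float32, 2, 3)   # 2 features per node
l = AGNNConv()
y = l(g, x)               # the layer adds self loops internally, so each node
                          # gets a second self loop and its own features are
                          # weighted twice in the attention softmax
```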
I suspect this is because of the `add_self_loops` method used in the implementation of `AGNNConv` (line 1018 of https://github.com/CarloLucibello/GraphNeuralNetworks.jl/blob/master/src/layers/conv.jl). With this logic, the layer doesn't detect already existing self loops. If they do exist, it adds more self loops, which might be leading to this result.

cc @CarloLucibello