CarloLucibello / GraphNeuralNetworks.jl

Graph Neural Networks in Julia
https://carlolucibello.github.io/GraphNeuralNetworks.jl/dev/
MIT License

`AGNNConv` behaviour different from mathematical definition (due to self loops) #325

Closed · codetalker7 closed this issue 10 months ago

codetalker7 commented 10 months ago

Given the mathematical definition of AGNNConv, I would not expect the following behaviour:

using GraphNeuralNetworks
source = [1,1,2,2,3,3,3,4]
target = [2,3,1,3,1,2,4,3]
matrix = rand(32, 4)
g = GNNGraph(source, target, ndata=(; x = matrix))

has_self_loops(g)            # outputs false

agnn_layer = AGNNConv(1f0)

## applying agnn layer without self loops
without_self_loops = agnn_layer(g)
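## (calling a layer on a GNNGraph returns a new GNNGraph with the layer's
## output stored in ndata, which is what the assertion below compares)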

## adding self loops
g = GNNGraph(source, target, ndata=(; x = matrix))
g = add_self_loops(g)
has_self_loops(g)           # outputs true

## applying agnn layer with self loops
with_self_loops = agnn_layer(g)

## the following should pass, but fails
@assert with_self_loops.ndata[:x] == without_self_loops.ndata[:x]

I suspect this is because of the add_self_loops call used in the implementation of AGNNConv (line 1018 of https://github.com/CarloLucibello/GraphNeuralNetworks.jl/blob/master/src/layers/conv.jl). That call doesn't detect already existing self loops: if a node already has one, a second self-loop edge is added on top, so the node's own features enter the attention softmax twice, which would explain the mismatch above.
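To make the suspected duplication concrete, here is a minimal sketch using only the GNNGraph constructor, add_self_loops, and the num_edges field, all used above or standard in the library:

using GraphNeuralNetworks
g  = GNNGraph([1, 2], [2, 1])  # 2 nodes, 2 edges, no self loops
g1 = add_self_loops(g)         # 4 edges: one self loop per node
g2 = add_self_loops(g1)        # 6 edges: existing self loops are not detected,
                               # so a second loop is stacked on each node
(g1.num_edges, g2.num_edges)   # (4, 6)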

cc @CarloLucibello.

codetalker7 commented 10 months ago

Also, while we're on the subject: according to the documented definition of AGNNConv, the forward pass also multiplies the features by a matrix $W$, but looking at the implementation this doesn't seem to happen (and if it did, shouldn't $W$ be a field of the AGNNConv struct?).
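For reference, the propagation rule the implementation appears to compute, with $\beta$ as the only trainable parameter and no $W$ anywhere, is

$$\alpha_{ij} = \frac{\exp(\beta \cos(\mathbf{x}_i, \mathbf{x}_j))}{\sum_{j' \in N(i)} \exp(\beta \cos(\mathbf{x}_i, \mathbf{x}_{j'}))}, \qquad \mathbf{x}_i' = \sum_{j \in N(i)} \alpha_{ij} \mathbf{x}_j,$$

where $N(i)$ is the neighborhood of node $i$, which includes $i$ itself because of the self loops the layer inserts.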

CarloLucibello commented 10 months ago

It is true that there is no matrix $W$; the docstring will be fixed in #328. As for the fact that self loops are added even if they are already present: this is expected. The library supports multigraphs, so multiple self loops on the same node are allowed, and the layer behaves the same as its PyTorch Geometric counterpart. If a different behavior is wanted, it is better to preprocess the graph as desired before passing it to the layer. In #328 I also add the option to set add_self_loops=false.
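A sketch of the two workarounds described above. remove_self_loops is the existing GNNGraph utility; the add_self_loops keyword and its exact placement in the constructor are assumptions based on the #328 comment, not a confirmed signature:

using GraphNeuralNetworks
source = [1, 1, 2, 2]
target = [2, 1, 1, 2]   # includes self loops (1->1 and 2->2)
g = GNNGraph(source, target, ndata = (; x = rand(Float32, 32, 2)))

## Option 1: strip pre-existing self loops, so the layer's internal
## add_self_loops is the only source of self loops
y1 = AGNNConv(1f0)(remove_self_loops(g))

## Option 2 (once #328 lands; keyword name taken from the comment above):
## disable the internal self-loop insertion and manage loops yourself
layer = AGNNConv(1f0, add_self_loops = false)   # hypothetical signature
y2 = layer(g)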