CarloLucibello / GraphNeuralNetworks.jl

Graph Neural Networks in Julia
https://carlolucibello.github.io/GraphNeuralNetworks.jl/dev/
MIT License

Merging multiple feature arrays #102

Closed · casper2002casper closed this issue 2 years ago

casper2002casper commented 2 years ago

I'm a bit confused about using multiple node feature arrays per graph. Using multiple arrays keeps different node features separate (e.g. x and y values), but when I try to pass the graph through a layer it throws an error. Is the intended use to keep all features in a single array? Couldn't all feature arrays be merged?

This works

julia> l = GCNConv(2=>1)
julia> g = rand_graph(4, 6, ndata=(x = ones(2,4),))  # trailing comma makes this a NamedTuple
julia> l(g)
GNNGraph:
    num_nodes = 4
    num_edges = 6
    ndata:
        x => (1, 4)

This doesn't

julia> g = rand_graph(4, 6, ndata=(x = ones(4), y = zeros(4)))
julia> l(g)
┌ Error: Multiple feature arrays, access directly through g.ndata
└ @ GraphNeuralNetworks.GNNGraphs ~/.julia/packages/GraphNeuralNetworks/KNr8R/src/GNNGraphs/query.jl:321
ERROR: MethodError: no method matching (::GCNConv{Matrix{Float32}, Vector{Float32}, typeof(identity)})(::GNNGraph{Tuple{Vector{Int64}, Vector{Int64}, Nothing}}, ::Nothing)
CarloLucibello commented 2 years ago

Yes, typically one keeps all features in a single array (e.g. one matrix of size num_features x num_nodes).

julia> g = rand_graph(4, 6, ndata = vcat(ones(1,4), zeros(1,4)))
GNNGraph:
    num_nodes = 4
    num_edges = 6
    ndata:
        x => (2, 4)

julia> l = GCNConv(2=>1)
GCNConv(2 => 1)

julia> l(g)
GNNGraph:
    num_nodes = 4
    num_edges = 6
    ndata:
        x => (1, 4)

julia> l(g, g.ndata.x)
1×4 Matrix{Float64}:
 -1.65217  -1.07596  -1.07596  -1.07596
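
Note the two calling conventions above: l(g) returns a new GNNGraph whose ndata.x holds the layer output, while l(g, g.ndata.x) returns the output array directly.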

We allow storing separate feature arrays because in some applications they have to be handled differently (see equivariant graph neural networks, for instance).
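
For example, here is a minimal sketch (the second array y and both array sizes are illustrative, not taken from this issue): store two arrays side by side, then pass the one you want to a layer explicitly.

using GraphNeuralNetworks

# two separate node feature arrays stored under different names (illustrative sizes)
g = rand_graph(4, 6, ndata = (x = ones(2, 4), y = zeros(3, 4)))
l = GCNConv(2 => 1)
# with multiple arrays there is no default feature, so pass the chosen one explicitly
h = l(g, g.ndata.x)   # 1×4 output matrix; y is left untouched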

PS: using julia code blocks will make your code more readable

casper2002casper commented 2 years ago

I see, thank you