FluxML / GeometricFlux.jl

Geometric Deep Learning for Flux
https://fluxml.ai/GeometricFlux.jl/stable/
MIT License
348 stars 30 forks

Uninformative Error when using GATConv / GATv2Conv when using WithGraph #348

Open hv10 opened 1 year ago

hv10 commented 1 year ago

Package Version

[7e08b658] GeometricFlux v0.13.7

Julia Version

1.8.0

OS / Environment

MacOS 12.6 Monterey, M1 Mac. Ubuntu based Linux (on Headless Server).

Describe the bug

When using GATConv or GATv2Conv with WithGraph, isolated nodes cause an error: the message passing for GAT convolutions does not catch the edge case where a node has no reachable neighbours. With the dynamic graph approach, the same input instead fails with an informative error stating that every vertex needs a self loop.

Steps to Reproduce

using GeometricFlux
using Graphs
g = erdos_renyi(20, 0.1)  # generate a random graph
add_vertex!(g)            # ensure at least one isolated node exists
fg = FeaturedGraph(g)
m = WithGraph(fg, GATConv(2=>1))
m(rand(2, 21))            # triggers the uninformative error
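A possible workaround, sketched under the assumption (suggested by the dynamic path's assertion message) that the GAT layers require every vertex to have a self loop, is to add the self loops explicitly before building the FeaturedGraph:

```julia
using GeometricFlux
using Graphs

g = erdos_renyi(20, 0.1)         # generate a random graph
add_vertex!(g)                   # isolated node, as in the repro above
for v in vertices(g)
    add_edge!(g, v, v)           # give every vertex a self loop so it receives its own message
end
fg = FeaturedGraph(g)
m = WithGraph(fg, GATConv(2=>1))
m(rand(2, 21))                   # the attention softmax now reduces over a non-empty neighbourhood
```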

Expected Results

I would have expected either to get the same error as in the dynamic case, or to have the layer simply add self loops anyway (although I understand that this would be quite ambiguous in practice).

Observed Results

When using the static graph approach (WithGraph), the error is highly uninformative about the actual problem (an invalid input graph).

Relevant log output

WithGraph case:

ERROR: ArgumentError: reducing over an empty collection is not allowed
Stacktrace:
  [1] _empty_reduce_error()
    @ Base ./reduce.jl:306
  [2] reducedim_init(f::Function, op::typeof(max), A::SubArray{Float64, 4, Array{Float64, 4}, Tuple{Base.Slice{Base.OneTo{Int64}}, Base.Slice{Base.OneTo{Int64}}, Vector{Int64}, Base.Slice{Base.OneTo{Int64}}}, false}, region::Int64)
    @ Base ./reducedim.jl:134
  [3] _mapreduce_dim
    @ ./reducedim.jl:371 [inlined]
  [4] #mapreduce#764
    @ ./reducedim.jl:357 [inlined]
  [5] #_maximum#806
    @ ./reducedim.jl:1023 [inlined]
  [6] _maximum
    @ ./reducedim.jl:1023 [inlined]
  [7] #_maximum#805
    @ ./reducedim.jl:1022 [inlined]
  [8] _maximum
    @ ./reducedim.jl:1022 [inlined]
  [9] #maximum#779
    @ ./reducedim.jl:994 [inlined]
 [10] softmax!(out::SubArray{Float64, 4, Array{Float64, 4}, Tuple{Base.Slice{Base.OneTo{Int64}}, Base.Slice{Base.OneTo{Int64}}, Vector{Int64}, Base.Slice{Base.OneTo{Int64}}}, false}, x::SubArray{Float64, 4, Array{Float64, 4}, Tuple{Base.Slice{Base.OneTo{Int64}}, Base.Slice{Base.OneTo{Int64}}, Vector{Int64}, Base.Slice{Base.OneTo{Int64}}}, false}; dims::Int64)
    @ NNlib ~/.julia/packages/NNlib/0QnJJ/src/softmax.jl:61
 [11] #softmax!#159
    @ ~/.julia/packages/NNlib/0QnJJ/src/softmax.jl:58 [inlined]
 [12] indexed_softmax(x::Array{Float64, 4}, xs::Vector{UInt32}, N::Int64; dims::Int64)
    @ GeometricFlux ~/.julia/packages/GeometricFlux/XpzlC/src/operation.jl:40
 [13] update_batch_edge(gat::GATConv{Float32, Matrix{Float32}, Vector{Float32}, typeof(identity)}, el::NamedTuple{(:N, :E, :es, :nbrs, :xs), Tuple{Int64, Int64, Vector{UInt32}, Vector{UInt32}, Vector{UInt32}}}, E::Nothing, X::Array{Float64, 3}, u::Nothing)
    @ GeometricFlux ~/.julia/packages/GeometricFlux/XpzlC/src/layers/graph_conv.jl:310
 [14] update_batch_edge(gat::GATConv{Float32, Matrix{Float32}, Vector{Float32}, typeof(identity)}, el::NamedTuple{(:N, :E, :es, :nbrs, :xs), Tuple{Int64, Int64, Vector{UInt32}, Vector{UInt32}, Vector{UInt32}}}, E::Nothing, X::Matrix{Float64}, u::Nothing)
    @ GeometricFlux ~/.julia/packages/GeometricFlux/XpzlC/src/layers/graph_conv.jl:297
 [15] propagate
    @ ~/.julia/packages/GeometricFlux/XpzlC/src/layers/gn.jl:173 [inlined]
 [16] GATConv
    @ ~/.julia/packages/GeometricFlux/XpzlC/src/layers/graph_conv.jl:348 [inlined]
 [17] (::WithGraph{GATConv{Float32, Matrix{Float32}, Vector{Float32}, typeof(identity)}, NamedTuple{(:N, :E, :es, :nbrs, :xs), Tuple{Int64, Int64, Vector{UInt32}, Vector{UInt32}, Vector{UInt32}}}, Nothing})(args::Matrix{Float64})
    @ GeometricFlux ~/.julia/packages/GeometricFlux/XpzlC/src/layers/msgpass.jl:66
 [18] top-level scope
    @ REPL[99]:1

Dynamic Case:

ERROR: AssertionError: a vertex must have self loop (receive a message from itself).
Stacktrace:
 [1] (::GATConv{Float32, Matrix{Float32}, Vector{Float32}, typeof(identity)})(fg::FeaturedGraph{SparseGraph{false, SparseMatrixCSC{Float32, UInt32}, Vector{UInt32}, Int64}, GraphSignals.NodeSignal{Matrix{Float64}}, GraphSignals.NullGraphSignal, GraphSignals.NullGraphSignal, GraphSignals.NullDomain})
   @ GeometricFlux ~/.julia/packages/GeometricFlux/XpzlC/src/layers/graph_conv.jl:338
 [2] top-level scope
   @ REPL[94]:1
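For reference, the dynamic path's assertion could in principle be mirrored as a pre-flight check before the static (WithGraph) path runs message passing. A minimal sketch against plain Graphs.jl (the function name assert_self_loops is hypothetical, not existing GeometricFlux API):

```julia
using Graphs

# Hypothetical pre-flight check: raise the same informative error as the
# dynamic path before the attention softmax ever sees an empty neighbourhood.
function assert_self_loops(g::AbstractGraph)
    for v in vertices(g)
        has_edge(g, v, v) ||
            throw(AssertionError("a vertex must have self loop (receive a message from itself)."))
    end
    return nothing
end
```

Running a check like this when the graph is cached in WithGraph would surface the actual problem at construction time instead of failing inside reducedim.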