SciML / DiffEqFlux.jl

Pre-built implicit layer architectures with O(1) backprop, GPUs, and stiff+non-stiff DE solvers, demonstrating scientific machine learning (SciML) and physics-informed machine learning methods
https://docs.sciml.ai/DiffEqFlux/stable
MIT License

MethodError when running NeuralODE with a custom Lux layer #825

Closed: RJZS closed this issue 1 year ago

RJZS commented 1 year ago

Hi, I'm trying to run a NeuralODE with a custom layer, LuxNeurLayer, that I've defined in Lux. I'm receiving a MethodError when I run prob_neuralode(u0, p, st). The error is:

MethodError: no method matching (::LuxNeurLayer{typeof(Lux.glorot_uniform)})(::Vector{Float32}, ::ComponentVector{Float32, Vector{Float32}, Tuple{Axis{(gain = ViewAxis(1:40, ShapedAxis((2, 20), NamedTuple())),)}}}, ::NamedTuple{(), Tuple{}})

The closest candidate suggested is (::LuxNeurLayer)(::AbstractMatrix, ::Any, ::NamedTuple), which I find odd, as it suggests that u0 is the issue (it's a Vector{Float32}, as in the tutorial).

Any help would be appreciated! The rest of the code is below.

using Lux, DiffEqFlux, DifferentialEquations, Optimization, OptimizationOptimJL, Random, Plots
struct LuxNeurLayer{F1} <: Lux.AbstractExplicitLayer
    in_dims::Int
    out_dims::Int
    init_gain::F1
end

function LuxNeurLayer(in_dims::Int, out_dims::Int; init_gain=Lux.glorot_uniform)
    return LuxNeurLayer{typeof(init_gain)}(in_dims, out_dims, init_gain)
end
l = LuxNeurLayer(20, 4)

function Lux.initialparameters(rng::AbstractRNG, l::LuxNeurLayer)
    # one gain per (neuron, input) pair; the layer has out_dims ÷ 2 neurons
    return (gain=l.init_gain(rng, l.out_dims ÷ 2, l.in_dims),)
end

Lux.initialstates(::AbstractRNG, ::LuxNeurLayer) = NamedTuple()

function (l::LuxNeurLayer)(x::AbstractMatrix, ps, st::NamedTuple)
    num_neurs = l.out_dims ÷ 2  # two state variables per neuron
    y = zeros(l.out_dims)
    for i in 1:num_neurs
        # synaptic currents: sigmoidal gating of the presynaptic states
        Isyns = zeros(l.in_dims)
        for j in 1:l.in_dims
            Isyns[j] = ps.gain[i, j] / (1 + exp(-0.4 * (x[2*j] + 0.6)))
        end
        y[2*i-1] = -y[2*i-1] + 0.5 * tanh(y[2*i-1]) + sum(Isyns)
        y[2*i]   = (y[2*i-1] - y[2*i]) / 5.0
    end
    return y, st
end
rng = Random.default_rng()
Random.seed!(rng, 0)

p, st = Lux.setup(rng, l)

# tspan and tsteps are not defined in the snippet above; example values so the script runs:
tspan = (0.0f0, 1.5f0)
tsteps = range(tspan[1], tspan[2]; length = 30)
prob_neuralode = NeuralODE(l, tspan, Tsit5(), saveat = tsteps)

using ComponentArrays
pinit = ComponentArray(p)
u0 = Float32[2.0; 0.0]

prob_neuralode(u0, pinit, st)
ChrisRackauckas commented 1 year ago

What version?

avik-pal commented 1 year ago

You haven't defined a dispatch for ::AbstractVector; you only have function (l::LuxNeurLayer)(x::AbstractMatrix, ps, st::NamedTuple).
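A Vector{Float32} is an AbstractVector, not an AbstractMatrix, so that method can never match a vector u0:

julia> Float32[2.0; 0.0] isa AbstractMatrix
false

julia> Float32[2.0; 0.0] isa AbstractVector
true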

avik-pal commented 1 year ago

The easiest thing to do would be:

function (l::LuxNeurLayer)(x::AbstractVector, ps, st::NamedTuple)
    y = reshape(x, :, 1)    # treat the vector as a single-column matrix
    res, st = l(y, ps, st)  # reuse the existing AbstractMatrix method
    return vec(res), st
end
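
With that method defined, the original call should dispatch to the vector method, reshape u0 into a single-column matrix, and reuse the matrix code path. A sketch, assuming the layer's dimensions match the state length (note that in the snippet above in_dims = 20, so the inner loop reads x[2*j] for j = 1:20 and needs a 40-element state):

prob_neuralode(u0, pinit, st)  # now hits the AbstractVector method instead of erroring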
RJZS commented 1 year ago

Perfect, thank you very much!

(For future reference, this was with Julia v1.7.2, DiffEqFlux v2.0.0, and Lux v0.4.53.)