YichengDWu / Sophon.jl

Efficient, Accurate, and Streamlined Training of Physics-Informed Neural Networks
https://yichengdwu.github.io/Sophon.jl/dev/
MIT License
54 stars · 5 forks

Can't reproduce ODE Tutorial: FullyConnected does not work #217

Closed arthur-bizzi closed 11 months ago

arthur-bizzi commented 11 months ago

First of all: thanks for the fantastic package. I especially appreciate how much cleaner the syntax is compared to NeuralPDE.

Following the ODE tutorial (Lotka-Volterra), this line throws an error:

pinn = PINN(x = FullyConnected(1, 1, sin; hidden_dims=8, num_layers=3),
            y = FullyConnected(1, 1, sin; hidden_dims=8, num_layers=3))
ERROR: MethodError: no method matching Dense(::Pair{Int64, Int64}, ::typeof(sin); init_weight::Sophon.var"#75#76"{typeof(sin)}, init_bias::typeof(Lux.zeros32), allow_fast_activation::Bool)

Closest candidates are:
  Dense(::Pair{<:Int64, <:Int64}, ::Any; init_weight, init_bias, use_bias, bias) got unsupported keyword argument "allow_fast_activation"
   @ Lux C:\Users\55619\.julia\packages\Lux\s0bDu\src\layers\basic.jl:177
  Dense(::Pair{<:Int64, <:Int64}) got unsupported keyword arguments "init_weight", "init_bias", "allow_fast_activation"
   @ Lux C:\Users\55619\.julia\packages\Lux\s0bDu\src\layers\basic.jl:177

Stacktrace:
 [1] kwerr(::NamedTuple{(:init_weight, :init_bias, :allow_fast_activation), Tuple{Sophon.var"#75#76"{typeof(sin)}, typeof(Lux.zeros32), Bool}}, ::Type, ::Pair{Int64, Int64}, ::Function)
   @ Base .\error.jl:165
 [2] macro expansion
   @ C:\Users\55619\.julia\packages\Sophon\hXYvs\src\layers\nets.jl:0 [inlined]
 [3] FullyConnected(layer_sizes::NTuple{5, Int64}, activation::typeof(sin), ::Val{true}; init_weight::Sophon.var"#75#76"{typeof(sin)}, init_bias::typeof(Lux.zeros32), allow_fast_activation::Bool)
   @ Sophon C:\Users\55619\.julia\packages\Sophon\hXYvs\src\layers\nets.jl:317
 [4] FullyConnected(in_dims::Int64, out_dims::Int64, activation::typeof(sin); hidden_dims::Int64, num_layers::Int64, outermost::Bool, init_weight::Function, init_bias::Function, allow_fast_activation::Bool)
   @ Sophon C:\Users\55619\.julia\packages\Sophon\hXYvs\src\layers\nets.jl:312
 [5] top-level scope
   @ Untitled-3:43

If I replace it with a similar Lux Chain, the script runs well:

n = 8  # hidden width
l = 1  # number of extra hidden layers
lays = [Dense(n => n, sin) for _ in 1:l]
# Constructing the physics-informed neural network (PINN)
pinn = PINN(x = Chain(Dense(1 => n, sin), lays..., Dense(n => 1)),
            y = Chain(Dense(1 => n, sin), lays..., Dense(n => 1)))

[image attachment]

YichengDWu commented 11 months ago

What is the version of Lux.jl in your environment?

arthur-bizzi commented 11 months ago

Currently, 0.4.37. Pkg won't let me upgrade to 0.5 due to compatibility issues with Sophon v0.4.3.
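(For anyone hitting the same wall: this is the standard way to inspect the resolved version and surface the conflicting `[compat]` bound. A sketch using plain Pkg commands, not specific to this repo's setup:)

```julia
using Pkg

Pkg.status("Lux")   # shows the currently resolved Lux version (here 0.4.37)

# Asking for Lux 0.5 explicitly makes the resolver print which package's
# [compat] entry (Sophon's, in this case) is capping Lux below 0.5:
Pkg.add(name = "Lux", version = "0.5")
```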

YichengDWu commented 11 months ago

I will tag a new version compatible with Lux 0.5.

YichengDWu commented 11 months ago

Could you try it with the latest version?
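(A minimal way to pull in the newly tagged release and confirm the resolved versions, assuming a standard Pkg environment:)

```julia
using Pkg

Pkg.update()                   # pick up the newly tagged Sophon release
Pkg.status("Sophon", "Lux")    # confirm both packages now resolve to compatible versions
```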

arthur-bizzi commented 11 months ago

Yep, now it works.

YichengDWu commented 11 months ago

Good. Thanks for the feedback!