SciML / DiffEqFlux.jl

Pre-built implicit layer architectures with O(1) backprop, GPUs, and stiff+non-stiff DE solvers, demonstrating scientific machine learning (SciML) and physics-informed machine learning methods
https://docs.sciml.ai/DiffEqFlux/stable
MIT License

TensorLayer not callable #879

Closed stephans3 closed 11 months ago

stephans3 commented 11 months ago

Hi all, I tried to use a TensorLayer and got an error message.

The TensorLayer is described here in the docs: https://docs.sciml.ai/DiffEqFlux/stable/examples/tensor_layer/.

Minimal Reproducible Example

using DiffEqFlux, Lux
A = [LegendreBasis(10), LegendreBasis(10)]
nn = TensorLayer(A, 1)
pinit = rand(100);
x0 = ones(2);
nn(x0, pinit) # throws a MethodError

Error & Stacktrace ⚠️

ERROR: MethodError: no method matching (::Chain{NamedTuple{(:layer_1, :layer_2), Tuple{WrappedFunction{DiffEqFlux.var"#53#55"{Vector{TensorProductBasisFunction{typeof(DiffEqFlux.__legendre_poly), Int64}}}}, Dense{false, typeof(identity), typeof(randn), typeof(zeros32)}}}, Nothing})(::Vector{Float64}, ::Vector{Float64})

Closest candidates are:
  (::Chain)(::Any, ::Any, ::NamedTuple)
   @ Lux ~/.julia/packages/Lux/hlo4t/src/layers/containers.jl:478

Stacktrace:
 [1] top-level scope
   @ ~path/to/file/demo_pinn.jl:9

I assume Lux does not need to be imported/used explicitly here. I believe the problem lies in DiffEqFlux rather than in Lux.

Environment (please complete the following information):

DiffEqFlux v3.0.0
Lux v0.5.10
Julia Version 1.9.4
Commit 8e5136fa297 (2023-11-14 08:46 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: 8 × Intel(R) Core(TM) i5-8350U CPU @ 1.70GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-14.0.6 (ORCJIT, skylake)
  Threads: 8 on 8 virtual cores
avik-pal commented 11 months ago

The docs are currently not built for v3. See https://docs.sciml.ai/DiffEqFlux/dev/examples/tensor_layer/
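[Editor's note: based on the `Closest candidates` hint in the stacktrace, the v3 `TensorLayer` is a Lux `Chain`, which takes parameters and state as separate arguments rather than a flat parameter vector. A sketch of the Lux-style calling convention, assuming the v3 interface shown in the dev docs linked above (untested against this exact version):

```julia
using DiffEqFlux, Lux, Random

# Same layer as in the reproducer
A = [LegendreBasis(10), LegendreBasis(10)]
nn = TensorLayer(A, 1)

# Lux layers are stateless: parameters and state are created explicitly
rng = Random.default_rng()
ps, st = Lux.setup(rng, nn)

x0 = ones(2)
y, st_new = nn(x0, ps, st)  # Lux convention: (input, parameters, state)
```

This replaces the v2-style `nn(x0, pinit)` call from the original report with the three-argument form that the error message's candidate signature `(::Chain)(::Any, ::Any, ::NamedTuple)` expects.]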