SciML / DiffEqFlux.jl

Pre-built implicit layer architectures with O(1) backprop, GPUs, and stiff+non-stiff DE solvers, demonstrating scientific machine learning (SciML) and physics-informed machine learning methods
https://docs.sciml.ai/DiffEqFlux/stable
MIT License

Error on the "Optimization of Ordinary Differential Equations" Tutorial #376

Closed: elalaouifaris closed this issue 4 years ago

elalaouifaris commented 4 years ago

Hi,

I just tried the "Optimization of Ordinary Differential Equations" tutorial.

The last optimization step fails with the error pasted at the end of this issue. There is also a typo on line 22, where prob_ode should be prob, if I understood correctly.
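
For context, here is roughly the code in question; this is a sketch reconstructed from the stack trace below rather than the exact tutorial text, and the loss, callback, optimizer settings, and ylim value are assumptions:

```julia
# Sketch of the tutorial setup (not the exact published code).
using OrdinaryDiffEq, DiffEqFlux, Flux, Plots

function lotka_volterra!(du, u, p, t)
    x, y = u
    α, β, δ, γ = p
    du[1] =  α*x - β*x*y
    du[2] = -δ*y + γ*x*y
end

u0    = [1.0, 1.0]
tspan = (0.0, 10.0)
p     = [1.5, 1.0, 3.0, 1.0]
prob  = ODEProblem(lotka_volterra!, u0, tspan, p)  # referred to as `prob_ode` in one place, per the typo above

function loss(p)
    sol = solve(prob, Tsit5(), p = p, saveat = 0.1)
    sum(abs2, sol .- 1), sol          # drive both populations toward the flat value 1
end

cb = function (p, l, sol)             # called each iteration with (parameters, loss, prediction)
    display(l)
    display(plot(sol, ylim = (0, 6))) # this plot call is where the BoundsError below originates
    false                             # returning false means "keep training"
end

res = DiffEqFlux.sciml_train(loss, p, ADAM(0.1), cb = cb, maxiters = 100)
```

The loss returns both the scalar value and the solution so the prediction can be handed to the callback, and it is the plot call inside that callback that produces the stack trace at the end of the issue.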

I'm using Windows 10 with JuliaPro 1.4.

Looking forward to making this work! Thanks! Faris

BoundsError: attempt to access 34-element Array{Float64,1} at index [101]
getindex at array.jl:788 [inlined]
diffeq_to_arrays(::ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float64,1},Nothing,ODEProblem{Array{Float64,1},Tuple{Float64,Float64},true,Array{Float64,1},ODEFunction{true,typeof(lotka_volterra!),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem},Tsit5,DiffEqBase.SensitivityInterpolation{Array{Float64,1},Array{Array{Float64,1},1}},DiffEqBase.DEStats}, ::Bool, ::Bool, ::Int64, ::Nothing, ::Float64, ::Nothing, ::Array{Tuple,1}, ::Symbol, ::Nothing) at solution_interface.jl:194
macro expansion at solution_interface.jl:77 [inlined]
apply_recipe(::Dict{Symbol,Any}, ::ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float64,1},Nothing,ODEProblem{Array{Float64,1},Tuple{Float64,Float64},true,Array{Float64,1},ODEFunction{true,typeof(lotka_volterra!),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem},Tsit5,DiffEqBase.SensitivityInterpolation{Array{Float64,1},Array{Array{Float64,1},1}},DiffEqBase.DEStats}) at RecipesBase.jl:281
_process_userrecipes!(::Plots.Plot{Plots.GRBackend}, ::Dict{Symbol,Any}, ::Tuple{ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float64,1},Nothing,ODEProblem{Array{Float64,1},Tuple{Float64,Float64},true,Array{Float64,1},ODEFunction{true,typeof(lotka_volterra!),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem},Tsit5,DiffEqBase.SensitivityInterpolation{Array{Float64,1},Array{Array{Float64,1},1}},DiffEqBase.DEStats}}) at user_recipe.jl:35
recipe_pipeline!(::Plots.Plot{Plots.GRBackend}, ::Dict{Symbol,Any}, ::Tuple{ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float64,1},Nothing,ODEProblem{Array{Float64,1},Tuple{Float64,Float64},true,Array{Float64,1},ODEFunction{true,typeof(lotka_volterra!),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem},Tsit5,DiffEqBase.SensitivityInterpolation{Array{Float64,1},Array{Array{Float64,1},1}},DiffEqBase.DEStats}}) at RecipesPipeline.jl:68
_plot!(::Plots.Plot{Plots.GRBackend}, ::Dict{Symbol,Any}, ::Tuple{ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float64,1},Nothing,ODEProblem{Array{Float64,1},Tuple{Float64,Float64},true,Array{Float64,1},ODEFunction{true,typeof(lotka_volterra!),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem},Tsit5,DiffEqBase.SensitivityInterpolation{Array{Float64,1},Array{Array{Float64,1},1}},DiffEqBase.DEStats}}) at plot.jl:167
plot(::ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float64,1},Nothing,ODEProblem{Array{Float64,1},Tuple{Float64,Float64},true,Array{Float64,1},ODEFunction{true,typeof(lotka_volterra!),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem},Tsit5,DiffEqBase.SensitivityInterpolation{Array{Float64,1},Array{Array{Float64,1},1}},DiffEqBase.DEStats}; kw::Base.Iterators.Pairs{Symbol,Tuple{Int64,Int64},Tuple{Symbol},NamedTuple{(:ylim,),Tuple{Tuple{Int64,Int64}}}}) at plot.jl:57
plot at plot.jl:51 [inlined]
(::var"#11#12")(::Array{Float64,1}, ::Float64, ::ODESolution{Float64,2,Array{Array{Float64,1},1},Nothing,Nothing,Array{Float64,1},Nothing,ODEProblem{Array{Float64,1},Tuple{Float64,Float64},true,Array{Float64,1},ODEFunction{true,typeof(lotka_volterra!),LinearAlgebra.UniformScaling{Bool},Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing,Nothing},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}},DiffEqBase.StandardODEProblem},Tsit5,DiffEqBase.SensitivityInterpolation{Array{Float64,1},Array{Array{Float64,1},1}},DiffEqBase.DEStats}) at test_DiffEqFlux.jl:37
macro expansion at train.jl:102 [inlined]
macro expansion at ProgressLogging.jl:328 [inlined]
(::DiffEqFlux.var"#32#37"{var"#11#12",Int64,Bool,Bool,typeof(loss),Array{Float64,1},Zygote.Params})() at train.jl:43
maybe_with_logger(::DiffEqFlux.var"#32#37"{var"#11#12",Int64,Bool,Bool,typeof(loss),Array{Float64,1},Zyg...

ChrisRackauckas commented 4 years ago

Hey, sorry about that. I updated those tutorials today (just improving the syntax and making things a bit cleaner!), and the library updates were still working their way through the system. If you update your packages, everything should be good. Let me know if it's not.
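
For anyone landing here later, updating the packages in the active environment is roughly the following; the status call is just to confirm which versions were resolved:

```julia
using Pkg
Pkg.update()   # pull the latest DiffEqFlux / DiffEqBase releases that the updated tutorial expects
Pkg.status()   # confirm the resolved package versions
```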

elalaouifaris commented 4 years ago

Thanks! The update solved the problem!

On the loss in the tutorial: it is set as the difference between the solve and a flat constant-1 function. Maybe it's more fun to have it learn the initial solve of the ODE and recover the parameters (a rough sketch of that variant follows at the end of this comment). It's kind of sad to see the curves flatten out!

I can contribute to the tutorials if you need support on this.
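
A rough sketch of that parameter-estimation variant, reusing the (assumed) prob and cb from the sketch above; data, p_guess, and the optimizer settings are purely illustrative:

```julia
# Hypothetical variant: generate "data" by solving at known parameters,
# then try to recover those parameters starting from a perturbed guess.
p_true  = [1.5, 1.0, 3.0, 1.0]
data    = Array(solve(prob, Tsit5(), p = p_true, saveat = 0.1))

function loss_fit(p)
    sol = solve(prob, Tsit5(), p = p, saveat = 0.1)
    sum(abs2, Array(sol) .- data), sol   # distance to the reference trajectory
end

p_guess = [1.2, 0.8, 2.5, 0.8]
res_fit = DiffEqFlux.sciml_train(loss_fit, p_guess, ADAM(0.1), cb = cb, maxiters = 300)
```

With a loss like this, the optimized curves should settle back onto the oscillating reference solution instead of flattening out toward 1.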

ChrisRackauckas commented 4 years ago

Yeah, we should probably change that first tutorial to be a parameter estimation against data. I just did that for the SDE tutorial yesterday: I think it reads a bit better and is more transferable.