SciML / ModelingToolkit.jl

An acausal modeling framework for automatically parallelized scientific machine learning (SciML) in Julia. A computer algebra system for integrated symbolics for physics-informed machine learning and automated transformations of differential equations.
https://mtk.sciml.ai/dev/

ReverseDiff in optimization problem #2968

Open · mmbosschaert opened this issue 1 month ago

mmbosschaert commented 1 month ago

Question❓

If I change the differentiation method to AutoReverseDiff() in the optimization example from "Optimizing through an ODE solve and re-creating MTK Problems", I obtain the following error when solving the problem:

julia> sol = solve(optprob, BFGS())
ERROR: ForwardDiffSensitivity assumes the `AbstractArray` interface for `p`. Thus while
DifferentialEquations.jl can support any parameter struct type, usage
with ForwardDiffSensitivity requires that `p` could be a valid
type for being the initial condition `u0` of an array. This means that
many simple types, such as `Tuple`s and `NamedTuple`s, will work as
parameters in normal contexts but will fail during ForwardDiffSensitivity
construction. To work around this issue for complicated cases like nested structs,
look into defining `p` using `AbstractArray` libraries such as RecursiveArrayTools.jl
or ComponentArrays.jl.

Stacktrace:
  [1] _concrete_solve_adjoint(::ODEProblem{…}, ::CompositeAlgorithm{…}, ::ForwardDiffSensitivity{…}, ::Vector{…}, ::ModelingToolkit.MTKParameters{…}, ::SciMLBase.ChainRulesOriginator; saveat::StepRangeLen{…}, kwargs::@Kwargs{…})
    @ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/PstNN/src/concrete_solve.jl:772
  [2] _concrete_solve_adjoint(::ODEProblem{…}, ::CompositeAlgorithm{…}, ::Nothing, ::Vector{…}, ::ModelingToolkit.MTKParameters{…}, ::SciMLBase.ChainRulesOriginator; verbose::Bool, kwargs::@Kwargs{…})
    @ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/PstNN/src/concrete_solve.jl:270
...

This has to do with changing the parameter values of the ODE problem using Tunable() and remake.

Is there a way to use reverse differentiation in this case?
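
A minimal sketch of the setup in question, approximating the documentation tutorial with AutoReverseDiff() swapped in (the model, names, bounds, and solver choice here are assumptions, not code taken from this issue):

```julia
using ModelingToolkit, OrdinaryDiffEq, Optimization, OptimizationOptimJL, ReverseDiff
using ModelingToolkit: t_nounits as t, D_nounits as D
using SymbolicIndexingInterface: parameter_values
using SciMLStructures: SciMLStructures, Tunable

# Lotka-Volterra, as in the MTK remake tutorial
@parameters α β γ δ
@variables x(t) y(t)
eqs = [D(x) ~ (α - β * y) * x,
       D(y) ~ (δ * x - γ) * y]
@mtkbuild sys = ODESystem(eqs, t)

odeprob = ODEProblem(sys, [x => 1.0, y => 1.0], (0.0, 10.0),
    [α => 1.5, β => 1.0, γ => 3.0, δ => 1.0])
timesteps = 0.0:0.1:10.0
data = Array(solve(odeprob, AutoTsit5(Rosenbrock23()); saveat = timesteps))

function loss(vals, p)
    odeprob, timesteps, data = p
    # swap in new tunable parameter values and re-create the MTK problem
    ps = SciMLStructures.replace(Tunable(), parameter_values(odeprob), vals)
    newprob = remake(odeprob; p = ps)
    sol = solve(newprob, AutoTsit5(Rosenbrock23()); saveat = timesteps)
    return sum(abs2, Array(sol) .- data) / length(data)
end

# the only change relative to the tutorial: AutoForwardDiff() -> AutoReverseDiff()
optfn = OptimizationFunction(loss, Optimization.AutoReverseDiff())
optprob = OptimizationProblem(optfn, rand(4), (odeprob, timesteps, data),
    lb = fill(0.1, 4), ub = fill(3.0, 4))
sol = solve(optprob, BFGS())
```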

SebastianM-C commented 1 month ago

As I understand it, this is still a work in progress. See https://github.com/SciML/SciMLSensitivity.jl/pull/1085 for more details.

ChrisRackauckas commented 3 weeks ago

This should be good with today's release, which handles the SciMLStructures interface for forward mode. There's no MWE here to double-check, so I'll close under the assumption that it's just the ForwardDiffSensitivity handling of MTK, as the stack trace alludes to.

Vaibhavdixit02 commented 2 weeks ago

This is still happening, but there seem to be some other errors along the way as well.

With the existing example it hits a StackOverflowError:

julia> sol = solve(optprob, BFGS())
ERROR: StackOverflowError:
Stacktrace:
     [1] anyeltypedual(::Type{T}, ::Type{Val{counter}}) where {T<:Union{Set, AbstractArray}, counter}
       @ DiffEqBase ~/.julia/packages/DiffEqBase/sCsah/src/forwarddiff.jl:242
--- the last 1 lines are repeated 1 more time ---
     [3] (::Base.MappingRF{typeof(DiffEqBase.anyeltypedual), Base.BottomRF{typeof(DiffEqBase.promote_dual)}})(acc::Type, x::Type)
       @ Base ./reduce.jl:100
     [4] _foldl_impl(op::Base.MappingRF{typeof(DiffEqBase.anyeltypedual), Base.BottomRF{…}}, init::Type, itr::Core.SimpleVector)
       @ Base ./reduce.jl:62
     [5] foldl_impl
       @ ./reduce.jl:48 [inlined]
     [6] mapfoldl_impl
       @ ./reduce.jl:44 [inlined]
     [7] mapfoldl
       @ ./reduce.jl:175 [inlined]
     [8] mapreduce
       @ ./reduce.jl:307 [inlined]
     [9] __anyeltypedual(::Type{ReverseDiff.TrackedReal{Float64, Float64, ReverseDiff.TrackedArray{…}}})
       @ DiffEqBase ~/.julia/packages/DiffEqBase/sCsah/src/forwarddiff.jl:227
    [10] anyeltypedual(::Type{ReverseDiff.TrackedReal{Float64, Float64, ReverseDiff.TrackedArray{…}}}, ::Type{Val{0}})
       @ DiffEqBase ~/.julia/packages/DiffEqBase/sCsah/src/forwarddiff.jl:233
--- the last 1 lines are repeated 1 more time ---
--- the last 11 lines are repeated 3398 more times ---
 [37390] anyeltypedual(x::ReverseDiff.TrackedArray{Float64, Float64, 1, Vector{Float64}, Vector{Float64}}, ::Type{Val{0}})
       @ DiffEqBase ~/.julia/packages/DiffEqBase/sCsah/src/forwarddiff.jl:293
--- the last 1 lines are repeated 1 more time ---
 [37392] anyeltypedual(p::MTKParameters{…}, ::Type{…})
       @ ModelingToolkit ~/.julia/packages/ModelingToolkit/2KZCu/src/systems/parameter_buffer.jl:612
 [37393] promote_u0
       @ ~/.julia/packages/DiffEqBase/sCsah/src/forwarddiff.jl:359 [inlined]
 [37394] get_concrete_problem(prob::ODEProblem{…}, isadapt::Bool; kwargs::@Kwargs{…})
       @ DiffEqBase ~/.julia/packages/DiffEqBase/sCsah/src/solve.jl:1171
 [37395] solve_up(prob::ODEProblem{…}, sensealg::Nothing, u0::Vector{…}, p::MTKParameters{…}, args::CompositeAlgorithm{…}; kwargs::@Kwargs{…})
       @ DiffEqBase ~/.julia/packages/DiffEqBase/sCsah/src/solve.jl:1074
 [37396] solve(prob::ODEProblem{…}, args::CompositeAlgorithm{…}; sensealg::Nothing, u0::Nothing, p::Nothing, wrap::Val{…}, kwargs::@Kwargs{…})
       @ DiffEqBase ~/.julia/packages/DiffEqBase/sCsah/src/solve.jl:1003
 [37397] loss(x::ReverseDiff.TrackedArray{…}, p::Tuple{…})
       @ Main ./REPL[16]:8
 [37398] (::OptimizationReverseDiffExt.var"#51#75"{…})(::ReverseDiff.TrackedArray{…})
       @ OptimizationReverseDiffExt ~/.julia/packages/OptimizationBase/KIIy3/ext/OptimizationReverseDiffExt.jl:161
 [37399] (::OptimizationReverseDiffExt.var"#54#78"{…})(x::ReverseDiff.TrackedArray{…})
       @ OptimizationReverseDiffExt ~/.julia/packages/OptimizationBase/KIIy3/ext/OptimizationReverseDiffExt.jl:175
 [37400] ReverseDiff.GradientTape(f::OptimizationReverseDiffExt.var"#54#78"{…}, input::Vector{…}, cfg::ReverseDiff.GradientConfig{…})
       @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/api/tape.jl:199
 [37401] gradient!(result::Vector{…}, f::Function, input::Vector{…}, cfg::ReverseDiff.GradientConfig{…})
       @ ReverseDiff ~/.julia/packages/ReverseDiff/p1MzG/src/api/gradients.jl:41
 [37402] (::OptimizationReverseDiffExt.var"#53#77"{…})(::Vector{…}, ::Vector{…})
       @ OptimizationReverseDiffExt ~/.julia/packages/OptimizationBase/KIIy3/ext/OptimizationReverseDiffExt.jl:174
 [37403] (::OptimizationOptimJL.var"#19#23"{…})(G::Vector{…}, θ::Vector{…})
       @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/hDX5k/src/OptimizationOptimJL.jl:297
 [37404] value_gradient!!(obj::OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, x::Vector{Float64})
       @ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/interface.jl:82
 [37405] value_gradient!!(bw::Optim.BarrierWrapper{…}, x::Vector{…})
       @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/solvers/constrained/fminbox.jl:81
 [37406] initial_state(method::BFGS{…}, options::Optim.Options{…}, d::Optim.BarrierWrapper{…}, initial_x::Vector{…})
       @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/solvers/first_order/bfgs.jl:94
 [37407] optimize(df::OnceDifferentiable{…}, l::Vector{…}, u::Vector{…}, initial_x::Vector{…}, F::Fminbox{…}, options::Optim.Options{…})
       @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/solvers/constrained/fminbox.jl:322
 [37408] __solve(cache::OptimizationCache{…})
       @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/hDX5k/src/OptimizationOptimJL.jl:321
 [37409] solve!(cache::OptimizationCache{…})
       @ SciMLBase ~/.julia/packages/SciMLBase/HReyK/src/solve.jl:188
 [37410] solve(::OptimizationProblem{…}, ::BFGS{…}; kwargs::@Kwargs{})
       @ SciMLBase ~/.julia/packages/SciMLBase/HReyK/src/solve.jl:96
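
Presumably the AutoZygote run below just swaps the AD backend in the sketch above (an assumption; Zygote has to be loaded for the extension):

```julia
using Zygote  # required for Optimization's Zygote extension

optfn = OptimizationFunction(loss, Optimization.AutoZygote())
optprob = OptimizationProblem(optfn, rand(4), (odeprob, timesteps, data),
    lb = fill(0.1, 4), ub = fill(3.0, 4))
```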

With AutoZygote it changes to:

ERROR: Need an adjoint for constructor MTKParameters{Vector{Float64}, StaticArraysCore.SizedVector{0, Any, Vector{Any}}, Tuple{}, Tuple{}}. Gradient is of type MTKParameters{Vector{Float64}, StaticArraysCore.SizedVector{0, Any, Vector{Any}}, Tuple{}, Tuple{}}
Stacktrace:
  [1] error(s::String)
    @ Base ./error.jl:35
  [2] (::Zygote.Jnew{…})(Δ::MTKParameters{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/lib/lib.jl:330
  [3] (::Zygote.var"#2210#back#313"{…})(Δ::MTKParameters{…})
    @ Zygote ~/.julia/packages/ZygoteRules/M4xmc/src/adjoint.jl:72
  [4] MTKParameters
    @ ~/.julia/packages/ModelingToolkit/2KZCu/src/systems/parameter_buffer.jl:7 [inlined]
  [5] (::Zygote.Pullback{…})(Δ::MTKParameters{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
  [6] setfields_object
    @ ~/.julia/packages/ConstructionBase/c2lWA/src/ConstructionBase.jl:195 [inlined]
  [7] (::Zygote.Pullback{…})(Δ::MTKParameters{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
  [8] setproperties_object
    @ ~/.julia/packages/ConstructionBase/c2lWA/src/ConstructionBase.jl:208 [inlined]
  [9] (::Zygote.Pullback{…})(Δ::MTKParameters{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
 [10] setproperties
    @ ~/.julia/packages/ConstructionBase/c2lWA/src/ConstructionBase.jl:136 [inlined]
 [11] (::Zygote.Pullback{…})(Δ::MTKParameters{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
 [12] set
    @ ~/.julia/packages/Setfield/PdKfV/src/lens.jl:122 [inlined]
 [13] replace
    @ ~/.julia/packages/Setfield/PdKfV/src/sugar.jl:197 [inlined]
 [14] (::Zygote.Pullback{…})(Δ::MTKParameters{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
 [15] loss
    @ ./REPL[16]:4 [inlined]
 [16] (::Zygote.Pullback{Tuple{…}, Tuple{…}})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
 [17] #291
    @ ~/.julia/packages/Zygote/nsBv0/src/lib/lib.jl:206 [inlined]
 [18] #2169#back
    @ ~/.julia/packages/ZygoteRules/M4xmc/src/adjoint.jl:72 [inlined]
 [19] OptimizationFunction
    @ ~/.julia/packages/SciMLBase/HReyK/src/scimlfunctions.jl:3812 [inlined]
 [20] #291
    @ ~/.julia/packages/Zygote/nsBv0/src/lib/lib.jl:206 [inlined]
 [21] (::Zygote.var"#2169#back#293"{Zygote.var"#291#292"{Tuple{…}, Zygote.Pullback{…}}})(Δ::Float64)
    @ Zygote ~/.julia/packages/ZygoteRules/M4xmc/src/adjoint.jl:72
 [22] #37
    @ ~/.julia/packages/OptimizationBase/KIIy3/ext/OptimizationZygoteExt.jl:94 [inlined]
 [23] (::Zygote.Pullback{Tuple{…}, Tuple{…}})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
 [24] #291
    @ ~/.julia/packages/Zygote/nsBv0/src/lib/lib.jl:206 [inlined]
 [25] #2169#back
    @ ~/.julia/packages/ZygoteRules/M4xmc/src/adjoint.jl:72 [inlined]
 [26] #39
    @ ~/.julia/packages/OptimizationBase/KIIy3/ext/OptimizationZygoteExt.jl:97 [inlined]
 [27] (::Zygote.Pullback{Tuple{…}, Tuple{…}})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
 [28] (::Zygote.var"#75#76"{Zygote.Pullback{Tuple{…}, Tuple{…}}})(Δ::Float64)
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface.jl:91
 [29] gradient(f::Function, args::Vector{Float64})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface.jl:148
 [30] (::OptimizationZygoteExt.var"#38#56"{OptimizationZygoteExt.var"#37#55"{…}})(::Vector{Float64}, ::Vector{Float64})
    @ OptimizationZygoteExt ~/.julia/packages/OptimizationBase/KIIy3/ext/OptimizationZygoteExt.jl:97
 [31] (::OptimizationOptimJL.var"#19#23"{…})(G::Vector{…}, θ::Vector{…})
    @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/hDX5k/src/OptimizationOptimJL.jl:297
 [32] value_gradient!!(obj::OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, x::Vector{Float64})
    @ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/interface.jl:82
 [33] value_gradient!!(bw::Optim.BarrierWrapper{…}, x::Vector{…})
    @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/solvers/constrained/fminbox.jl:81
 [34] initial_state(method::BFGS{…}, options::Optim.Options{…}, d::Optim.BarrierWrapper{…}, initial_x::Vector{…})
    @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/solvers/first_order/bfgs.jl:94
 [35] optimize(df::OnceDifferentiable{…}, l::Vector{…}, u::Vector{…}, initial_x::Vector{…}, F::Fminbox{…}, options::Optim.Options{…})
    @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/solvers/constrained/fminbox.jl:322
 [36] __solve(cache::OptimizationCache{…})
    @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/hDX5k/src/OptimizationOptimJL.jl:321
 [37] solve!(cache::OptimizationCache{…})
    @ SciMLBase ~/.julia/packages/SciMLBase/HReyK/src/solve.jl:188
 [38] solve(::OptimizationProblem{…}, ::BFGS{…}; kwargs::@Kwargs{})
    @ SciMLBase ~/.julia/packages/SciMLBase/HReyK/src/solve.jl:96
 [39] solve(::OptimizationProblem{…}, ::BFGS{…})
    @ SciMLBase ~/.julia/packages/SciMLBase/HReyK/src/solve.jl:93
 [40] top-level scope
    @ ~/Optimization.jl/test/minibatch.jl:121
Some type information was truncated. Use `show(err)` to see complete types.

Somehow, changing the sensealg to Zygote leads to:

julia> sol = solve(optprob, BFGS())
ERROR: Adjoint sensitivity analysis functionality requires being able to solve
a differential equation defined by the parameter struct `p`. Thus while
DifferentialEquations.jl can support any parameter struct type, usage
with adjoint sensitivity analysis requires that `p` could be a valid
type for being the initial condition `u0` of an array. This means that
many simple types, such as `Tuple`s and `NamedTuple`s, will work as
parameters in normal contexts but will fail during adjoint differentiation.
To work around this issue for complicated cases like nested structs, look
into defining `p` using `AbstractArray` libraries such as RecursiveArrayTools.jl
or ComponentArrays.jl so that `p` is an `AbstractArray` with a concrete element type.

Stacktrace:
  [1] _concrete_solve_adjoint(::ODEProblem{…}, ::CompositeAlgorithm{…}, ::InterpolatingAdjoint{…}, ::Vector{…}, ::MTKParameters{…}, ::SciMLBase.ChainRulesOriginator; save_start::Bool, save_end::Bool, saveat::StepRangeLen{…}, save_idxs::Nothing, kwargs::@Kwargs{})
    @ SciMLSensitivity ~/.julia/packages/SciMLSensitivity/se3y4/src/concrete_solve.jl:378
  [2] _solve_adjoint(prob::ODEProblem{…}, sensealg::InterpolatingAdjoint{…}, u0::Vector{…}, p::MTKParameters{…}, originator::SciMLBase.ChainRulesOriginator, args::CompositeAlgorithm{…}; merge_callbacks::Bool, kwargs::@Kwargs{…})
    @ DiffEqBase ~/.julia/packages/DiffEqBase/sCsah/src/solve.jl:1537
  [3] rrule(::typeof(DiffEqBase.solve_up), prob::ODEProblem{…}, sensealg::InterpolatingAdjoint{…}, u0::Vector{…}, p::MTKParameters{…}, args::CompositeAlgorithm{…}; kwargs::@Kwargs{…})
    @ DiffEqBaseChainRulesCoreExt ~/.julia/packages/DiffEqBase/sCsah/ext/DiffEqBaseChainRulesCoreExt.jl:26
  [4] kwcall(::@NamedTuple{…}, ::typeof(ChainRulesCore.rrule), ::Zygote.ZygoteRuleConfig{…}, ::Function, ::ODEProblem{…}, ::InterpolatingAdjoint{…}, ::Vector{…}, ::MTKParameters{…}, ::CompositeAlgorithm{…})
    @ ChainRulesCore ~/.julia/packages/ChainRulesCore/I1EbV/src/rules.jl:140
  [5] chain_rrule_kw
    @ ~/.julia/packages/Zygote/nsBv0/src/compiler/chainrules.jl:235 [inlined]
  [6] macro expansion
    @ ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0 [inlined]
  [7] _pullback(::Zygote.Context{…}, ::typeof(Core.kwcall), ::@NamedTuple{…}, ::typeof(DiffEqBase.solve_up), ::ODEProblem{…}, ::InterpolatingAdjoint{…}, ::Vector{…}, ::MTKParameters{…}, ::CompositeAlgorithm{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:87
  [8] _apply(::Function, ::Vararg{Any})
    @ Core ./boot.jl:838
  [9] adjoint
    @ ~/.julia/packages/Zygote/nsBv0/src/lib/lib.jl:203 [inlined]
 [10] _pullback
    @ ~/.julia/packages/ZygoteRules/M4xmc/src/adjoint.jl:67 [inlined]
 [11] #solve#51
    @ ~/.julia/packages/DiffEqBase/sCsah/src/solve.jl:1003 [inlined]
 [12] _pullback(::Zygote.Context{…}, ::DiffEqBase.var"##solve#51", ::InterpolatingAdjoint{…}, ::Nothing, ::Nothing, ::Val{…}, ::@Kwargs{…}, ::typeof(solve), ::ODEProblem{…}, ::CompositeAlgorithm{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
 [13] _apply(::Function, ::Vararg{Any})
    @ Core ./boot.jl:838
 [14] adjoint
    @ ~/.julia/packages/Zygote/nsBv0/src/lib/lib.jl:203 [inlined]
 [15] _pullback
    @ ~/.julia/packages/ZygoteRules/M4xmc/src/adjoint.jl:67 [inlined]
 [16] solve
    @ ~/.julia/packages/DiffEqBase/sCsah/src/solve.jl:993 [inlined]
 [17] _pullback(::Zygote.Context{…}, ::typeof(Core.kwcall), ::@NamedTuple{…}, ::typeof(solve), ::ODEProblem{…}, ::CompositeAlgorithm{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
 [18] loss
    @ ~/Optimization.jl/test/minibatch.jl:125 [inlined]
 [19] _pullback(::Zygote.Context{…}, ::typeof(loss), ::Vector{…}, ::Tuple{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
 [20] _apply
    @ ./boot.jl:838 [inlined]
 [21] adjoint
    @ ~/.julia/packages/Zygote/nsBv0/src/lib/lib.jl:203 [inlined]
 [22] _pullback
    @ ~/.julia/packages/ZygoteRules/M4xmc/src/adjoint.jl:67 [inlined]
 [23] OptimizationFunction
    @ ~/.julia/packages/SciMLBase/HReyK/src/scimlfunctions.jl:3812 [inlined]
 [24] _pullback(::Zygote.Context{…}, ::OptimizationFunction{…}, ::Vector{…}, ::Tuple{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
 [25] _apply(::Function, ::Vararg{Any})
    @ Core ./boot.jl:838
 [26] adjoint
    @ ~/.julia/packages/Zygote/nsBv0/src/lib/lib.jl:203 [inlined]
 [27] _pullback
    @ ~/.julia/packages/ZygoteRules/M4xmc/src/adjoint.jl:67 [inlined]
 [28] #37
    @ ~/.julia/packages/OptimizationBase/KIIy3/ext/OptimizationZygoteExt.jl:94 [inlined]
 [29] _pullback(ctx::Zygote.Context{…}, f::OptimizationZygoteExt.var"#37#55"{…}, args::Vector{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
 [30] _apply(::Function, ::Vararg{Any})
    @ Core ./boot.jl:838
 [31] adjoint
    @ ~/.julia/packages/Zygote/nsBv0/src/lib/lib.jl:203 [inlined]
 [32] _pullback
    @ ~/.julia/packages/ZygoteRules/M4xmc/src/adjoint.jl:67 [inlined]
 [33] #39
    @ ~/.julia/packages/OptimizationBase/KIIy3/ext/OptimizationZygoteExt.jl:97 [inlined]
 [34] _pullback(ctx::Zygote.Context{…}, f::OptimizationZygoteExt.var"#39#57"{…}, args::Vector{…})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface2.jl:0
 [35] pullback(f::Function, cx::Zygote.Context{false}, args::Vector{Float64})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface.jl:90
 [36] pullback
    @ ~/.julia/packages/Zygote/nsBv0/src/compiler/interface.jl:88 [inlined]
 [37] gradient(f::Function, args::Vector{Float64})
    @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface.jl:147
 [38] (::OptimizationZygoteExt.var"#38#56"{OptimizationZygoteExt.var"#37#55"{…}})(::Vector{Float64}, ::Vector{Float64})
    @ OptimizationZygoteExt ~/.julia/packages/OptimizationBase/KIIy3/ext/OptimizationZygoteExt.jl:97
 [39] (::OptimizationOptimJL.var"#19#23"{…})(G::Vector{…}, θ::Vector{…})
    @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/hDX5k/src/OptimizationOptimJL.jl:297
 [40] value_gradient!!(obj::OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, x::Vector{Float64})
    @ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/interface.jl:82
 [41] value_gradient!!(bw::Optim.BarrierWrapper{…}, x::Vector{…})
    @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/solvers/constrained/fminbox.jl:81
 [42] initial_state(method::BFGS{…}, options::Optim.Options{…}, d::Optim.BarrierWrapper{…}, initial_x::Vector{…})
    @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/solvers/first_order/bfgs.jl:94
 [43] optimize(df::OnceDifferentiable{…}, l::Vector{…}, u::Vector{…}, initial_x::Vector{…}, F::Fminbox{…}, options::Optim.Options{…})
    @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/solvers/constrained/fminbox.jl:322
 [44] __solve(cache::OptimizationCache{…})
    @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/hDX5k/src/OptimizationOptimJL.jl:321
 [45] solve!(cache::OptimizationCache{…})
    @ SciMLBase ~/.julia/packages/SciMLBase/HReyK/src/solve.jl:188
 [46] solve(::OptimizationProblem{…}, ::BFGS{…}; kwargs::@Kwargs{})
    @ SciMLBase ~/.julia/packages/SciMLBase/HReyK/src/solve.jl:96
 [47] solve(::OptimizationProblem{…}, ::BFGS{…})
    @ SciMLBase ~/.julia/packages/SciMLBase/HReyK/src/solve.jl:93
 [48] top-level scope
    @ ~/Optimization.jl/test/minibatch.jl:137

Also note: the failure on master in Optimization.jl https://github.com/SciML/Optimization.jl/actions/runs/10550106197/job/29225903838#step:7:1169 seems to be related; it goes away when using sensealg = InterpolatingAdjoint(; autojacvec = ZygoteVJP()) there.
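
For reference, that workaround amounts to passing the sensitivity algorithm explicitly to the ODE solve inside the loss; a sketch under the same assumed setup as earlier:

```julia
using SciMLSensitivity: InterpolatingAdjoint, ZygoteVJP

# inside the loss function, request the adjoint method and VJP backend explicitly
sol = solve(newprob, AutoTsit5(Rosenbrock23()); saveat = timesteps,
    sensealg = InterpolatingAdjoint(; autojacvec = ZygoteVJP()))
```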

ChrisRackauckas commented 2 weeks ago

ERROR: Need an adjoint for constructor MTKParameters{Vector{Float64}, StaticArraysCore.SizedVector{0, Any, Vector{Any}}, Tuple{}, Tuple{}}. Gradient is of type MTKParameters{Vector{Float64}, StaticArraysCore.SizedVector{0, Any, Vector{Any}}, Tuple{}, Tuple{}}

This constructor already exists? https://github.com/SciML/ModelingToolkit.jl/blob/master/ext/MTKChainRulesCoreExt.jl#L7 Are you on the latest MTK?
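
For context, "an adjoint for a constructor" means a ChainRulesCore rrule for the type's constructor. A generic, hypothetical sketch for an unrelated struct Foo (not MTK's actual implementation, which lives in the linked extension):

```julia
using ChainRulesCore

struct Foo{A, B}
    a::A
    b::B
end

# reverse-mode rule for calling the constructor Foo(a, b)
function ChainRulesCore.rrule(::Type{<:Foo}, a, b)
    Foo_pullback(Δ) = (NoTangent(), Δ.a, Δ.b)
    return Foo(a, b), Foo_pullback
end
```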

Vaibhavdixit02 commented 2 weeks ago

Yeah, v9.33.1. I guess that's not the big issue; the main ones are the last two. Though it probably is an issue for SciMLSensitivity.

ChrisRackauckas commented 2 weeks ago

That really looks like an environment issue. It's not seeing that p is a SciMLStructure and it's not finding the dispatch that I am pointing to. Both of those things exist and have tests. So, something might be held back somewhere.

ChrisRackauckas commented 2 weeks ago

There are a few things that needed to be released. Let me try that first 😅

ChrisRackauckas commented 2 weeks ago

I think everything in SciMLSensitivity.jl is updated, so this should be fine. @AayushSabharwal can you check, and can you add reverse mode to our docs/tests in the right spots?

AayushSabharwal commented 1 week ago

ReverseDiff.jl doesn't work because DiffEqBase doesn't like it (anyeltypedual runs into a StackOverflowError). After that, promote_u0 doesn't handle the case when p isa MTKParameters && p.tunable isa ReverseDiff.TrackedArray. I'm working on fixing this.
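
A hypothetical illustration of that case, reusing the assumed sketch from earlier in the thread (not a test from the codebase):

```julia
using ReverseDiff
using SymbolicIndexingInterface: parameter_values
using SciMLStructures: SciMLStructures, Tunable

ReverseDiff.gradient(rand(4)) do vals
    # inside the tape, vals is a ReverseDiff.TrackedArray
    ps = SciMLStructures.replace(Tunable(), parameter_values(odeprob), vals)
    # ps is an MTKParameters object whose tunable portion now holds tracked values,
    # which is the case DiffEqBase's promote_u0 / anyeltypedual has to recognize
    sum(abs2, vals)
end
```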

AayushSabharwal commented 1 week ago

https://github.com/JuliaSymbolics/SymbolicUtils.jl/pull/646 and https://github.com/SciML/DiffEqBase.jl/pull/1078 are necessary to use AutoReverseDiff in the doc example.