SciML / Optimization.jl

Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
https://docs.sciml.ai/Optimization/stable/
MIT License

Parameter Optimization of a MethodOfLines Discretized PDE with Newest ModelingToolkit Version #851

Closed leeandrepc closed 3 weeks ago

leeandrepc commented 3 weeks ago

I've been trying to update a model that used to work (I think pre-ModelingToolkit v9.xx.x) to the most recent version of ModelingToolkit, v9.49.0. However, when I try to run the parameter-optimization part of the model, it errors, and I cannot figure out how to resolve it.

Minimal Reproducible Example 👇

using ModelingToolkit, MethodOfLines, DifferentialEquations, Plots, Statistics
import ModelingToolkit: Interval
using SciMLStructures: replace!, Tunable, replace

gauss_ic(x) = 5.0 * exp(-1/2 * (x - 0.5)^2 / 0.5^2)

@parameters t, x, A
@variables c(..)
@register_symbolic gauss_ic(x)
Dt = Differential(t)
Dx = Differential(x)
Dxx = Differential(x)^2

eqs = [Dt(c(t, x)) ~ A * Dxx(c(t, x))]

domain = [t ∈ Interval(0.0, 1.0), 
    x ∈ Interval(0.0, 1.0)]

bcs = [ c(0., x) ~ gauss_ic(x),  
    Dx(c(t, 0.)) ~ 0.0, 
    Dx(c(t, 1.)) ~ 0.0 ]

@named pdesys = PDESystem(eqs, bcs, domain, [t, x], [c(t, x)], [A]; defaults =  Dict(A => 0.01))
dx = 0.01
discretization = MOLFiniteDifference([x => dx], t)
prob = discretize(pdesys, discretization)
sol = DifferentialEquations.solve(prob, Tsit5())
data = sol.u[c(t, x)]
t_ = 0:1/(size(data)[1]-1):1
x_ = 0:1/(size(data)[2]-1):1

function mse_loss(x, p)
    prob_ = p
    replace!(Tunable(), prob_.p, x)
    newsol = DifferentialEquations.solve(prob_, Tsit5())
    return mean(abs2.(newsol(t_, x_)[1] .- data))
end

using Optimization, OptimizationOptimJL, OptimizationOptimisers
optfunc = Optimization.OptimizationFunction(mse_loss, Optimization.AutoForwardDiff())
optprob = Optimization.OptimizationProblem(optfunc, [0.1], prob)
opt_sol = Optimization.solve(optprob, ADAM(), maxiters = 100)

At the Optimization.solve line, it errors out. I can use AutoFiniteDiff, and that runs and outputs the expected value for A. However, the other AD types suggested in the documentation each fail with a different error.

I can paste the full error and stacktrace output if needed, but I'm running into character-limit issues, so for now here is the truncated version.

Error & Stacktrace ⚠️

ERROR: MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{…}, Float64}, Float64, 1})
The type `Float64` exists, but no method is defined for this combination of argument types when trying to construct it.

Closest candidates are:
  (::Type{T})(::Real, ::RoundingMode) where T<:AbstractFloat
   @ Base rounding.jl:265
  (::Type{T})(::T) where T<:Number
   @ Core boot.jl:900
  Float64(::IrrationalConstants.Log2π)
   @ IrrationalConstants ~/.julia/packages/IrrationalConstants/vp5v4/src/macro.jl:112
  ...

Stacktrace:
  [1] convert(::Type{Float64}, x::ForwardDiff.Dual{ForwardDiff.Tag{…}, Float64, 1})
    @ Base ./number.jl:7
  [2] setindex!(A::Memory{Float64}, x::ForwardDiff.Dual{ForwardDiff.Tag{…}, Float64, 1}, i1::Int64)
    @ Base ./genericmemory.jl:234
  [3] unsafe_copyto!(dest::Memory{Float64}, doffs::Int64, src::Memory{ForwardDiff.Dual{…}}, soffs::Int64, n::Int64)
    @ Base ./genericmemory.jl:144
  [4] unsafe_copyto!
    @ ./genericmemory.jl:124 [inlined]
  [5] _copyto_impl!
    @ ./array.jl:308 [inlined]
  [6] copyto!
    @ ./array.jl:294 [inlined]
  [7] copyto!
    @ ./array.jl:319 [inlined]
  [8] replace!
    @ ~/.julia/packages/ModelingToolkit/eiNg3/src/systems/parameter_buffer.jl:294 [inlined]
  [9] mse_loss(x::Vector{…}, p::ODEProblem{…})
    @ Main ~/OneDrive/School/Lab/Canadian Light Source/data/IMaging/cls/para_est_MWE.jl:38
 [10] (::OptimizationForwardDiffExt.var"#37#55"{…})(::Vector{…})
    @ OptimizationForwardDiffExt ~/.julia/packages/OptimizationBase/KIIy3/ext/OptimizationForwardDiffExt.jl:98
 [11] #39
    @ ~/.julia/packages/OptimizationBase/KIIy3/ext/OptimizationForwardDiffExt.jl:102 [inlined]
 [12] vector_mode_dual_eval!
    @ ~/.julia/packages/ForwardDiff/onEFt/src/apiutils.jl:24 [inlined]
 [13] vector_mode_gradient!(result::Vector{…}, f::OptimizationForwardDiffExt.var"#39#57"{…}, x::Vector{…}, cfg::ForwardDiff.GradientConfig{…})
    @ ForwardDiff ~/.julia/packages/ForwardDiff/onEFt/src/gradient.jl:98
 [14] gradient!
    @ ~/.julia/packages/ForwardDiff/onEFt/src/gradient.jl:39 [inlined]
 [15] (::OptimizationForwardDiffExt.var"#38#56"{…})(::Vector{…}, ::Vector{…})
    @ OptimizationForwardDiffExt ~/.julia/packages/OptimizationBase/KIIy3/ext/OptimizationForwardDiffExt.jl:102
 [16] macro expansion
    @ ~/.julia/packages/OptimizationOptimisers/AOkbT/src/OptimizationOptimisers.jl:68 [inlined]
 [17] macro expansion
    @ ~/.julia/packages/Optimization/4ueet/src/utils.jl:32 [inlined]
 [18] __solve(cache::OptimizationCache{…})
    @ OptimizationOptimisers ~/.julia/packages/OptimizationOptimisers/AOkbT/src/OptimizationOptimisers.jl:66
 [19] solve!(cache::OptimizationCache{…})
    @ SciMLBase ~/.julia/packages/SciMLBase/hJh6T/src/solve.jl:186
 [20] solve(::OptimizationProblem{…}, ::Optimisers.Adam; kwargs::@Kwargs{…})
    @ SciMLBase ~/.julia/packages/SciMLBase/hJh6T/src/solve.jl:94
 [21] top-level scope
    @ ~/OneDrive/School/Lab/Canadian Light Source/data/IMaging/cls/para_est_MWE.jl:46
Some type information was truncated. Use `show(err)` to see complete types.
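
The failure in frames [1]–[8] can be reproduced in miniature without ModelingToolkit: under AutoForwardDiff the tunables arrive as ForwardDiff.Dual numbers, and the in-place replace! tries to copy them into the problem's existing Float64 parameter buffer. A standalone sketch of that conversion error (illustrative snippet, not part of the original report):

using ForwardDiff

buffer = zeros(Float64, 1)      # stands in for the problem's Float64 tunable buffer
d = ForwardDiff.Dual(0.1, 1.0)  # a dual number carrying one partial derivative
buffer[1] = d                   # ERROR: MethodError: no method matching Float64(::ForwardDiff.Dual{...})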

Environment (please complete the following information):

  [b0b7db55] ComponentArrays v0.15.17
  [a93c6f00] DataFrames v1.7.0
  [82cc6244] DataInterpolations v6.5.2
  [0c46a032] DifferentialEquations v7.14.0
  [5b8099bc] DomainSets v0.7.14
  [6a86dc24] FiniteDiff v2.26.0
  [26cc04aa] FiniteDifferences v0.12.32
  [033835bb] JLD2 v0.5.7
  [94925ecb] MethodOfLines v0.11.6
  [961ee093] ModelingToolkit v9.49.0
  [3933049c] MultistartOptimization v0.2.2
⌅ [7f7a1694] Optimization v3.28.0
  [e4316d97] OptimizationMultistartOptimization v0.2.0
⌃ [4e6fcdb7] OptimizationNLopt v0.2.2
⌅ [36348300] OptimizationOptimJL v0.3.2
⌅ [42dfb2eb] OptimizationOptimisers v0.2.1
⌃ [500b13db] OptimizationPolyalgorithms v0.2.1
  [1dea7af3] OrdinaryDiffEq v6.89.0
  [91a5bcdd] Plots v1.40.8
  [29dad682] RegularizationTools v0.6.0
  [37e2e3b7] ReverseDiff v1.15.3
  [53ae85a6] SciMLStructures v1.5.0
  [10745b16] Statistics v1.11.1
  [2efcf032] SymbolicIndexingInterface v0.3.34
julia> versioninfo()
Julia Version 1.11.1
Commit 8f5b7ca12ad (2024-10-16 10:53 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: 16 × AMD Ryzen 7 6800H with Radeon Graphics
  WORD_SIZE: 64
  LLVM: libLLVM-16.0.6 (ORCJIT, znver3)
Threads: 16 default, 0 interactive, 8 GC (on 16 virtual cores)
Environment:
  JULIA_EDITOR = code
Vaibhavdixit02 commented 3 weeks ago

This error seems to be coming from the replace! call, @AayushSabharwal.

AayushSabharwal commented 3 weeks ago

You can't use replace! if the types of your parameters change, and here they could be either Float64 or ForwardDiff.Dual. Since replace! is in-place, it won't work with this. replace is what you want.
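
For concreteness, a minimal sketch of the out-of-place fix applied to the MWE's loss function (assuming remake, which SciMLBase provides and DifferentialEquations re-exports): replace returns a new parameter object whose tunable portion takes on the element type of x, and remake builds a fresh problem around it, leaving the original untouched.

function mse_loss(x, p)
    newp = replace(Tunable(), p.p, x)  # out-of-place: the new buffer's eltype follows x, so Duals are fine
    prob_ = remake(p; p = newp)        # fresh problem with the new parameter object
    newsol = DifferentialEquations.solve(prob_, Tsit5())
    return mean(abs2.(newsol(t_, x_)[1] .- data))
end

With this version no Dual is ever written into the original problem's Float64 buffers, so the AutoForwardDiff path should no longer hit the conversion error.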

ChrisRackauckas commented 3 weeks ago

@leeandrepc is there a tutorial you got this from? It would be good to update the docs there.

leeandrepc commented 3 weeks ago

Thanks for the response and guidance. I have been able to get the MWE to work, and I'm assuming for now that my actual model will work too.

My main reference when trying to update the model and debug it myself was: Optimizing through an ODE solve and re-creating MTK Problems. Although that page says multiple times that replace! is only for situations without type changes, it was never clear to me which situations would feed changed types to replace!, so I never made the connection that this was where my error stemmed from. I now see, on a third read, that there is an example that specifically uses AutoFiniteDiff() rather than AutoForwardDiff() because of the type change (sketched just below). So it is probably a me problem.
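
For contrast, a sketch of why the original in-place loss runs under AutoFiniteDiff (same MWE objects assumed; only the AD backend changes): finite differencing evaluates the loss only at plain Float64 points, so replace! never sees a Dual.

optfunc_fd = Optimization.OptimizationFunction(mse_loss, Optimization.AutoFiniteDiff())
optprob_fd = Optimization.OptimizationProblem(optfunc_fd, [0.1], prob)
opt_sol_fd = Optimization.solve(optprob_fd, ADAM(), maxiters = 100)  # runs without the Dual conversion error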

Secondary references while debugging were the Defining OptimizationProblem and OptimizationFunction pages.