SciML / Optimization.jl

Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
https://docs.sciml.ai/Optimization/stable/
MIT License

Maxiters not respected #335


MaAl13 commented 1 year ago

Hello,

I have recently been running the documentation tutorial for parameter optimization of ODEs: https://sensitivity.sciml.ai/dev/ode_fitting/optimization_ode/

However, when I decrease maxiters to 5 for multiple algorithms, none of them respects the limit. How can I avoid this problem?
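For reference, a minimal sketch of the call pattern in question, with a toy quadratic loss standing in for the tutorial's ODE-fitting objective (package and backend names follow current releases and are assumptions here):

```julia
using Optimization, OptimizationPolyalgorithms
using ForwardDiff  # loads the AutoForwardDiff backend in recent versions

# Toy quadratic loss standing in for the tutorial's ODE-fitting objective.
loss(u, p) = sum(abs2, u .- p)

optf = OptimizationFunction(loss, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2), [1.0, 2.0])

# maxiters is the keyword under discussion: the solver is expected to
# stop after at most 5 iterations, which is what fails here.
sol = solve(prob, PolyOpt(), maxiters = 5)
```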

ChrisRackauckas commented 1 year ago

Which algorithms?

MaAl13 commented 1 year ago

PolyOpt and BFGS. If I set maxiters to 5, they both run for more than 5 iterations in my case.

ChrisRackauckas commented 1 year ago

Both of those are BFGS from Optim.jl, and that's a known issue with Optim.jl.
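One workaround, assuming the callback interface documented for Optimization.jl (recent versions pass an optimization state as the first argument; older ones pass the current parameters): count the callback invocations yourself and return `true` to halt the solve.

```julia
using OptimizationOptimJL  # wraps Optim.jl and re-exports BFGS

# Reusing `prob` from the sketch above. The callback runs once per
# iteration; returning true tells solve() to halt regardless of whether
# the wrapped optimizer honors maxiters.
iters = Ref(0)
cb = function (state, loss_val)
    iters[] += 1
    return iters[] >= 5
end
sol = solve(prob, BFGS(), callback = cb, maxiters = 5)
```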

MaAl13 commented 1 year ago

Alright, which package would you recommend instead?

MaAl13 commented 1 year ago

Also would you prefer DiffEqParamEstim over Optimization.jl for parameter estimation with ODEs?

ChrisRackauckas commented 1 year ago

NLopt.jl tends to behave better. PolyOpt should probably be changed to use it.
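For example, the same problem can be sent through the NLopt wrapper, where maxiters is mapped onto NLopt's own stopping criteria (a sketch, reusing `prob` from above; `LD_LBFGS` is a gradient-based quasi-Newton method comparable to BFGS):

```julia
using OptimizationNLopt  # wraps and re-exports NLopt

sol = solve(prob, NLopt.LD_LBFGS(), maxiters = 5)
```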

> Also would you prefer DiffEqParamEstim over Optimization.jl for parameter estimation with ODEs?

That's not a sensible question: DiffEqParamEstim is just a system that automates the generation of loss functions. You still have to choose an optimizer to minimize the loss function with, and the most sensible choice in 2022 would be Optimization.jl.
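A sketch of that division of labor, following the pattern in the DiffEqParamEstim docs (the toy ODE and data here are stand-ins): DiffEqParamEstim builds the loss objective, and Optimization.jl minimizes it.

```julia
using DifferentialEquations, DiffEqParamEstim
using Optimization, OptimizationOptimJL, ForwardDiff

# Toy exponential-growth ODE with one unknown parameter p[1].
f(u, p, t) = p[1] .* u
prob_ode = ODEProblem(f, [1.0], (0.0, 1.0), [1.5])

# Synthetic data generated at the true parameter value 1.5.
t = collect(range(0.0, 1.0, length = 20))
data = reshape(exp.(1.5 .* t), 1, :)

# DiffEqParamEstim automates the loss: solve the ODE, compare to data.
cost = build_loss_objective(prob_ode, Tsit5(), L2Loss(t, data),
                            Optimization.AutoForwardDiff())

# Optimization.jl then does the actual optimization of that loss.
optprob = OptimizationProblem(cost, [1.0])
sol = solve(optprob, BFGS(), maxiters = 5)
```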