SciML / Optimization.jl

Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
https://docs.sciml.ai/Optimization/stable/

Optim.jl's IPNewton should not require constraints #629

Closed sebapersson closed 9 months ago

sebapersson commented 9 months ago

Interior-point Newton methods such as IPNewton or Ipopt are capable of handling nonlinear constraints, but it is also possible to run them with just box constraints. In our benchmarks on ODE models we have found IPNewton to perform well for smaller models. However, when trying to use either IPNewton or Ipopt, Optimization currently requests constraints (MVE below).

The problem can be worked around by providing an empty constraint function, but to make things easier for the user it would be nice if this were not necessary. In particular, we have just wrapped Optimization for PEtab.jl (and it is great to gain access to so many algorithms via one interface!), and it would be great if users did not have to flag that they want to use an interior-point method when creating the OptimizationProblem.
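For reference, the workaround I mean looks roughly like the sketch below. The empty in-place cons function and the zero-length lcons/ucons vectors are my own guess at the minimal way to satisfy the check, not a verified or recommended pattern:

using Optimization, ForwardDiff, OptimizationOptimJL

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
_p = [1.0, 100.0]

# Dummy in-place constraint function that defines no constraints at all
cons_empty(res, x, p) = nothing

f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(); cons = cons_empty)
# Zero-length constraint bounds to match the empty constraint function
prob = OptimizationProblem(f, x0, _p, lb = [-1.0, -1.0], ub = [0.8, 0.8],
    lcons = Float64[], ucons = Float64[])
sol = solve(prob, IPNewton())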

MVE

using Optimization, ForwardDiff, OptimizationOptimJL

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
_p = [1.0, 100.0]

f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(f, x0, _p, lb = [-1.0, -1.0], ub = [0.8, 0.8])
sol = solve(prob, IPNewton())

Error message:

ERROR: The algorithm IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol} requires constraints, pass them with the `cons` kwarg in `OptimizationFunction`.
Stacktrace:
 [1] _check_opt_alg(prob::OptimizationProblem{true, OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, typeof(rosenbrock), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, alg::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol}; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
   @ SciMLBase ~/.julia/packages/SciMLBase/LRUtn/src/solve.jl:112
 [2] _check_opt_alg(prob::OptimizationProblem{true, OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, typeof(rosenbrock), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, alg::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol})
   @ SciMLBase ~/.julia/packages/SciMLBase/LRUtn/src/solve.jl:105
 [3] init(::OptimizationProblem{true, OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, typeof(rosenbrock), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol}; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
   @ SciMLBase ~/.julia/packages/SciMLBase/LRUtn/src/solve.jl:162
 [4] init(::OptimizationProblem{true, OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, typeof(rosenbrock), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol})
   @ SciMLBase ~/.julia/packages/SciMLBase/LRUtn/src/solve.jl:161
 [5] solve(::OptimizationProblem{true, OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, typeof(rosenbrock), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol}; kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
   @ SciMLBase ~/.julia/packages/SciMLBase/LRUtn/src/solve.jl:94
 [6] solve(::OptimizationProblem{true, OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, typeof(rosenbrock), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Vector{Float64}, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol})
   @ SciMLBase ~/.julia/packages/SciMLBase/LRUtn/src/solve.jl:91
 [7] top-level scope
   @ ~/Dropbox/PhD/Projects/tmp/Optim_fail/Example_crash.jl:9
Vaibhavdixit02 commented 9 months ago

Do you see the error with Ipopt as well?

sebapersson commented 9 months ago

Sorry, my mistake. With Ipopt I do not see the error, so this code works:

using Optimization, ForwardDiff, OptimizationMOI, Ipopt

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
_p = [1.0, 100.0]

f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(f, x0, _p, lb = [-1.0, -1.0], ub = [0.8, 0.8])
sol = solve(prob, Ipopt.Optimizer())
Vaibhavdixit02 commented 9 months ago

No worries, I just wanted to confirm.

ChrisRackauckas commented 9 months ago

Fixed the title