SciML / Optimization.jl

Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
https://docs.sciml.ai/Optimization/stable/

Reopen issue 755: OptimizationOptimJL.Optim.BFGS() missing tests for user-supplied derivatives (grad), upper/lower bounds, and the combination #760

Closed sjdaines closed 1 month ago

sjdaines commented 1 month ago

See https://github.com/SciML/Optimization.jl/issues/755; this is a duplicate, filed because I can't reopen that issue.

This looks like it may just have been a git mistake: commit https://github.com/SciML/Optimization.jl/pull/759/commits/d2aabd51778829734feec9017db9407069c6c8ea has the title 'Add box constraints x closed form gradient test' (which would close https://github.com/SciML/Optimization.jl/issues/755), but the commit doesn't actually include any code that does that!

The code added:

```julia
prob = OptimizationProblem(optprob, x0, _p; sense = Optimization.MaxSense, lb = [-1.0, -1.0], ub = [0.8, 0.8])
sol = solve(prob, BFGS())
@test 10 * sol.objective < l1
```

(https://github.com/SciML/Optimization.jl/blob/d2aabd51778829734feec9017db9407069c6c8ea/lib/OptimizationOptimJL/test/runtests.jl#L173C5-L175C34), where

```julia
optprob = OptimizationFunction(rosenbrock, Optimization.AutoModelingToolkit(true, false))
```

(https://github.com/SciML/Optimization.jl/blob/d2aabd51778829734feec9017db9407069c6c8ea/lib/OptimizationOptimJL/test/runtests.jl#L164C6-L165C55)

doesn't actually test a case where prob.f.adtype isa SciMLBase.NoAD and prob.f.grad is a user-supplied function (here, prob.f.adtype is AutoSparse{AutoSymbolics, ADTypes.NoSparsityDetector, ADTypes.NoColoringAlgorithm}(AutoSymbolics(), ADTypes.NoSparsityDetector(), ADTypes.NoColoringAlgorithm())).
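
For reference, here is a minimal sketch of the kind of test that would cover the missing combination: a hand-written gradient (so prob.f.adtype isa SciMLBase.NoAD and prob.f.grad is a user function) together with box constraints. The gradient function name and the final assertion are illustrative, not taken from the existing test suite:

```julia
using Optimization, OptimizationOptimJL, Test

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# Closed-form gradient supplied by the user (illustrative name)
function rosenbrock_grad!(G, x, p)
    G[1] = -2.0 * (p[1] - x[1]) - 4.0 * p[2] * x[1] * (x[2] - x[1]^2)
    G[2] = 2.0 * p[2] * (x[2] - x[1]^2)
end

x0 = zeros(2)
_p = [1.0, 100.0]

# No AD backend passed, so prob.f.adtype isa SciMLBase.NoAD and prob.f.grad is the supplied function
optf = OptimizationFunction(rosenbrock; grad = rosenbrock_grad!)
prob = OptimizationProblem(optf, x0, _p; lb = [-1.0, -1.0], ub = [0.8, 0.8])
sol = solve(prob, BFGS())  # bounds + BFGS, exercising the Fminbox path with a user gradient
@test sol.objective < rosenbrock(x0, _p)  # illustrative check: the solver improved on the initial point
```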