See https://github.com/SciML/Optimization.jl/issues/755; this is a duplicate, filed because I can't reopen that issue.
Looks like this was maybe just a git mistake (?): commit https://github.com/SciML/Optimization.jl/pull/759/commits/d2aabd51778829734feec9017db9407069c6c8ea has the title 'Add box constraints x closed form gradient test' (which would close https://github.com/SciML/Optimization.jl/issues/755), but the commit doesn't actually include any code that does that!
The code added at
https://github.com/SciML/Optimization.jl/blob/d2aabd51778829734feec9017db9407069c6c8ea/lib/OptimizationOptimJL/test/runtests.jl#L173C5-L175C34
(with the setup at
https://github.com/SciML/Optimization.jl/blob/d2aabd51778829734feec9017db9407069c6c8ea/lib/OptimizationOptimJL/test/runtests.jl#L164C6-L165C55)
doesn't actually test a case where `prob.f.adtype isa SciMLBase.NoAD` and `prob.f.grad` is a supplied function. Here, `prob.f.adtype` is `AutoSparse{AutoSymbolics, ADTypes.NoSparsityDetector, ADTypes.NoColoringAlgorithm}(AutoSymbolics(), ADTypes.NoSparsityDetector(), ADTypes.NoColoringAlgorithm())`.
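For reference, a minimal sketch of what a test covering the missing case might look like: a box-constrained problem built with `SciMLBase.NoAD()` as the `adtype` and a hand-written closed-form gradient passed via `grad`. The problem (Rosenbrock), the bounds, and the solver choice here are illustrative assumptions, not the exact test the PR intended:

```julia
using Optimization, OptimizationOptimJL, SciMLBase

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# Hand-written closed-form gradient, using SciMLBase's in-place (G, x, p) convention.
function rosenbrock_grad!(G, x, p)
    G[1] = -2 * (p[1] - x[1]) - 4 * p[2] * x[1] * (x[2] - x[1]^2)
    G[2] = 2 * p[2] * (x[2] - x[1]^2)
    return G
end

x0 = zeros(2)
p = [1.0, 100.0]

# The case issue #755 asks to cover: NoAD adtype + user-supplied gradient.
optf = OptimizationFunction(rosenbrock, SciMLBase.NoAD(); grad = rosenbrock_grad!)
prob = OptimizationProblem(optf, x0, p; lb = [-1.0, -1.0], ub = [2.0, 2.0])

@assert prob.f.adtype isa SciMLBase.NoAD  # would have been false in the PR's test

# Box-constrained solve using the supplied gradient.
sol = solve(prob, Fminbox(LBFGS()))
```

The key check is the `@assert`: in the PR as merged, `prob.f.adtype` is the `AutoSparse{AutoSymbolics, ...}` value quoted above rather than `SciMLBase.NoAD`, so the supplied-gradient code path is never exercised.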