SciML / Optimization.jl

Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
https://docs.sciml.ai/Optimization/stable/
MIT License

Add trait for checking if OptimizationFunction is used for derivative based optimizers #711

Closed Vaibhavdixit02 closed 2 months ago

Vaibhavdixit02 commented 4 months ago

Describe the bug 🐞

Currently, when a user passes a plain objective function instead of an OptimizationFunction to a derivative-based solver, the error thrown is an unhelpful field-access failure along the lines of `grad` not found. The way to fix this would be to define a trait, e.g. a `requiresderivative`, like the solver traits in https://github.com/SciML/SciMLBase.jl/blob/0998e074058c99098c673bb3f6f37e30d282ea2f/src/alg_traits.jl#L77-L131, and then have the solver subpackages check it for each algorithm, similar to how other traits are used now: https://github.com/SciML/Optimization.jl/blob/master/lib/OptimizationOptimJL/src/OptimizationOptimJL.jl#L9-L16.
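The proposal above can be sketched as follows. This is a minimal, self-contained illustration of the trait pattern, not the final SciMLBase API: the names `requiresderivative` and `check_objective` are assumptions, and `OptimizationFunction` here is a placeholder stand-in for the real SciMLBase type.

```julia
# Placeholder stand-in for SciMLBase.OptimizationFunction (assumption, for illustration).
struct OptimizationFunction{F}
    f::F
end

abstract type AbstractOptimizationAlgorithm end
struct LBFGSAlg <: AbstractOptimizationAlgorithm end       # derivative-based stand-in
struct NelderMeadAlg <: AbstractOptimizationAlgorithm end  # derivative-free stand-in

# Proposed trait: does this algorithm need derivative information?
requiresderivative(::AbstractOptimizationAlgorithm) = false
requiresderivative(::LBFGSAlg) = true

# Check the trait up front and throw an informative error instead of
# an obscure `grad` field-access failure deep inside the solver.
function check_objective(f, alg::AbstractOptimizationAlgorithm)
    if requiresderivative(alg) && !(f isa OptimizationFunction)
        throw(ArgumentError(
            "$(typeof(alg)) requires derivatives; wrap the objective in an " *
            "OptimizationFunction with an AD backend or user-supplied gradient."))
    end
    return true
end
```

In the real implementation the trait would live in SciMLBase and each solver wrapper (e.g. OptimizationOptimJL) would declare it per algorithm, with the check performed in `solve`.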

Expected behavior

The user is informed that the chosen algorithm requires an OptimizationFunction with derivative information, rather than seeing an internal `grad` error.

Minimal Reproducible Example 👇

using Optimization
using OptimizationOptimJL

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

# Plain function, no OptimizationFunction wrapper, so no gradient is attached.
prob = OptimizationProblem(rosenbrock, u0, p)

# LBFGS is derivative-based; this currently fails with an obscure `grad` error.
sol = solve(prob, LBFGS())
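For contrast, the working form of the example wraps the objective in an OptimizationFunction with an AD backend, which attaches the gradient that LBFGS needs (standard Optimization.jl usage; the expected optimum near `[1.0, 1.0]` is the known Rosenbrock minimum for these parameters):

```julia
using Optimization, OptimizationOptimJL
using ForwardDiff  # backend behind AutoForwardDiff()

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

# The AD choice supplies `grad`, so derivative-based solvers work.
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, u0, p)
sol = solve(prob, LBFGS())
```

The proposed trait check would only fire on the unwrapped version, leaving this path untouched.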