SciML / Optimization.jl

Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
https://docs.sciml.ai/Optimization/stable/
MIT License

Don't add a differentiation algorithm to the PRIMA example #720

Closed: ChrisRackauckas closed this 3 months ago

ChrisRackauckas commented 3 months ago

Fixes https://github.com/SciML/Optimization.jl/issues/719

Vaibhavdixit02 commented 3 months ago

No, that's not the issue. This is part of the OptimizationBase merge. I am looking at it. It will be needed by other backends as well.

ChrisRackauckas commented 3 months ago

But why is the PRIMA example using autodiff in the first place? I agree that there are two issues, but this doc example shouldn't be differentiating anything.

Vaibhavdixit02 commented 3 months ago

Because in the constrained case PRIMA expects a different interface for linear and nonlinear constraints:

Ω = { x ∈ ℝⁿ | xₗ ≤ x ≤ xᵤ, Aₑ⋅x = bₑ, Aᵢ⋅x ≤ bᵢ, cₑ(x) = 0, and cᵢ(x) ≤ 0 }

I use autodiff to create this representation from the functional interface.
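
For context, here is a minimal sketch of what the constrained PRIMA usage looks like, assuming the standard `OptimizationFunction`/`OptimizationProblem` API; the objective, constraint, and bounds below are illustrative rather than the exact docs example:

```julia
using Optimization, OptimizationPRIMA, ForwardDiff

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# One constraint in the usual functional (in-place) form
cons(res, x, p) = (res .= [x[1]^2 + x[2]^2])

x0 = zeros(2)
p = [1.0, 100.0]

# The AD backend is not differentiating the objective (COBYLA is
# derivative-free); it is used to form constraint Jacobians so the
# linear and nonlinear constraints can be routed to PRIMA separately.
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(); cons = cons)
prob = OptimizationProblem(optf, x0, p; lcons = [-Inf], ucons = [1.0])
sol = solve(prob, OptimizationPRIMA.COBYLA())
```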

Vaibhavdixit02 commented 3 months ago

Also, in general, we would never error if an AD backend is passed, even for a derivative-free optimizer. We do throw an error the other way around, though, and that should get better with https://github.com/SciML/Optimization.jl/pull/715 and the corresponding PR in SciMLBase. Though that's not entirely correct yet.
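
A sketch of that asymmetry, assuming the Optim.jl wrapper; the exact error text may differ between versions:

```julia
using Optimization, OptimizationOptimJL

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# No AD backend and no analytic gradient supplied
optf = OptimizationFunction(rosenbrock)
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])

# A derivative-free method runs fine without a backend
solve(prob, NelderMead())

# A gradient-based method throws, since there is no way to obtain ∇f
solve(prob, LBFGS())
```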

ChrisRackauckas commented 3 months ago

oh it requires knowing the linear operators?

Vaibhavdixit02 commented 3 months ago

Yes
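
A hypothetical illustration (not Optimization.jl internals) of how autodiff can recover the linear operators from the functional interface: a constraint row is linear exactly when its Jacobian row is constant, so comparing Jacobians at a couple of points gives a cheap heuristic test:

```julia
using ForwardDiff

# Illustrative constraint function: first row is linear, second is not
c(x) = [2x[1] + 3x[2] - 1, x[1]^2 + x[2]]

J1 = ForwardDiff.jacobian(c, randn(2))
J2 = ForwardDiff.jacobian(c, randn(2))

# Rows where the Jacobian agrees at distinct points are candidates for
# the linear blocks Aₑ, Aᵢ; the rest stay as nonlinear cₑ(x), cᵢ(x).
linear_rows = [isapprox(J1[i, :], J2[i, :]) for i in axes(J1, 1)]
```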