JuliaSmoothOptimizers / Percival.jl

Implementation of an augmented Lagrangian method

External solvers support #36

Closed. ivborissov closed this issue 3 years ago.

ivborissov commented 4 years ago

Hi, thanks for the great package!

Do you have plans to add support for suboptimizers other than JSOSolvers? Or maybe you can give guidance on how to do it? My interest is to use the augmented Lagrangian method with different subsolvers (gradient-based and derivative-free optimizers from various packages).

abelsiqueira commented 4 years ago

Hi, @ivborissov, thanks for using it. We have plans to allow other subsolvers internally, but they must be "JSO-compliant", a term that I haven't defined officially yet (paper in development), but which can be heard in this video: https://www.youtube.com/watch?v=9tykQTSuyDk

Essentially, it means the input must be an AbstractNLPModel from NLPModels.jl and the output must be a GenericExecutionStats from SolverTools.jl. Some keyword arguments will be expected as well, for clean swapping of methods, but I'm writing some constrained solvers first, before deciding on those. Notice that all our solvers are JSO-compliant (JSOSolvers, CaNNOLeS, Percival itself), as well as our wrappers (NLPModelsIpopt and NLPModelsKnitro).
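For illustration, a JSO-compliant solver might look like the sketch below. The solver name and the fixed-step gradient-descent internals are just a toy, and the `GenericExecutionStats` keywords follow the SolverTools API as of this writing; treat it as a shape, not an implementation:

```julia
using LinearAlgebra, NLPModels, SolverTools

# Toy JSO-compliant solver: input is an AbstractNLPModel,
# output is a GenericExecutionStats.
function my_subsolver(nlp::AbstractNLPModel; atol = 1e-8, rtol = 1e-8, max_eval = 10_000)
  x = copy(nlp.meta.x0)        # start from the model's initial point
  gx = grad(nlp, x)
  ϵ = atol + rtol * norm(gx)   # usual absolute + relative tolerance
  while norm(gx) > ϵ && neval_grad(nlp) < max_eval
    x -= 1.0e-2 * gx           # toy step: fixed-step gradient descent
    gx = grad(nlp, x)
  end
  status = norm(gx) ≤ ϵ ? :first_order : :max_eval
  return GenericExecutionStats(status, nlp, solution = x,
                               objective = obj(nlp, x), dual_feas = norm(gx))
end
```

Anything with this shape could, in principle, be dropped in as a subsolver once the keyword interface is settled.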

If you're willing to help/make a pull request, the proposed work is:

PS. We are also developing a derivative-free package here: https://github.com/JuliaSmoothOptimizers/DerivativeFreeSolvers.jl; maybe you'll find it useful. They are supposed to be JSO-compliant too, but I haven't touched them in a while.

ivborissov commented 4 years ago

Abel, thanks a lot for the clarification! I will try the derivative-free solvers with Percival. However, the augmented Lagrangian is not a completely derivative-free approach, because of the gradient projection method. Do you know if there is a way to avoid using gradients in the augmented Lagrangian? And one more question, regarding bound constraints: am I right that only tron from JSOSolvers supports them? So, in the case of other solvers, it seems the best I can do is to pass the bounds as inequality constraints (like x -> x[i] - bound[i]).
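Concretely, I mean something like this sketch (shown with today's ADNLPModels.jl constructor, which may differ from other versions of the API):

```julia
using ADNLPModels

# Sketch of the workaround: replace the bound x[1] ≥ 0 by the
# inequality constraint c(x) = x[1] ≥ 0.
nlp = ADNLPModel(
  x -> (x[1] - 1)^2 + 4 * (x[2] - x[1]^2)^2,  # objective
  [-1.2, 1.0],                                # starting point
  x -> [x[1]],                                # c(x) = x[1]
  [0.0],                                      # lcon: c(x) ≥ 0
  [Inf],                                      # ucon: unbounded above
)
```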

abelsiqueira commented 4 years ago

Actually, you're gonna have to manually modify quite a few places to get a derivative-free augmented Lagrangian method. Our stationarity measure in Percival uses the gradient, so it can't be avoided without changing a lot of internals. The method would need a different kind of stationarity control. Initially, I imagine you could just ignore the test and run for a limited number of iterations, but for a complete study, see for instance:

M. A. Diniz-Ehrhardt, J. M. Martínez, L. G. Pedroso. Derivative-free methods for nonlinear programming with general lower-level constraints. Computational & Applied Mathematics, v. 30, p. 19–52, 2011. DOI: 10.1590/S1807-03022011000100003

and related papers. (I'm trying to get @pedrosolucas to adopt Julia, so this may be it).
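For a quick experiment along the "ignore the test" line, I imagine something like the following, assuming keyword names such as `atol`, `rtol`, and `max_iter` in `percival`'s signature (check the docstring of your version):

```julia
using Percival

# Sketch: loosen the gradient-based stopping tolerances so the
# stationarity test is effectively inert, and cap the outer iterations.
# `nlp` is any AbstractNLPModel, e.g. the one built above.
output = percival(nlp, atol = 1.0e20, rtol = 1.0e20, max_iter = 50)
println(output.status)
```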

Regarding bounds, tron is the only pure-Julia solver we have that currently handles them. We had a student implementing an interior-point method to compare the two, but it was not completed. Ipopt and Knitro can solve bounded problems and can be accessed through NLPModelsIpopt and NLPModelsKnitro. Unfortunately, passing the bounds as inequalities won't be enough, because Percival converts inequalities into equalities with bounded slack variables, so the subproblems are still bound-constrained. Handling inequalities directly is future work.
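For bounds specifically, a minimal sketch with the Ipopt wrapper (again with the ADNLPModel constructor as in today's ADNLPModels.jl):

```julia
using ADNLPModels, NLPModelsIpopt

# Bound-constrained problem: min (x₁ - 1)² + (x₂ - 2)²  s.t.  0 ≤ x ≤ 2.
nlp = ADNLPModel(x -> (x[1] - 1)^2 + (x[2] - 2)^2,
                 zeros(2),      # starting point
                 zeros(2),      # lower bounds
                 2 * ones(2))   # upper bounds
stats = ipopt(nlp)              # JSO-compliant: returns a GenericExecutionStats
println(stats.status, " at x = ", stats.solution)
```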

ivborissov commented 4 years ago

I see. Thanks for the detailed explanation and for the link!