MadNLP / MadNLP.jl

A solver for nonlinear programming
MIT License

LBFGS approximation of the inverse Hessian #39

Closed mohamed82008 closed 1 year ago

mohamed82008 commented 3 years ago

Hi, thanks for this fantastic package. I have a question. I couldn't see any way to avoid passing the Hessian in and instead rely on an LBFGS approximation of the (inverse) Hessian. Is there a way to do this in MadNLP now? If not, is it too much work to add?

frapac commented 3 years ago

I am running into a similar issue. I think we could build something directly in the Hessian callback by wrapping the LBFGS operator implemented in JuliaSmoothOptimizers: https://github.com/JuliaSmoothOptimizers/LinearOperators.jl/blob/master/src/lbfgs.jl It's not straightforward to implement, but I think it's doable. In my opinion, the main difficulty is passing the LBFGS matrix to the KKT matrix before solving the linear system.
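For reference, the inverse-Hessian product that such an operator provides is the classic L-BFGS two-loop recursion. This is not MadNLP's or LinearOperators.jl's code, just a minimal numpy sketch of what the operator computes (the function name `lbfgs_hv` is illustrative):

```python
import numpy as np

def lbfgs_hv(g, pairs, gamma):
    """Apply the L-BFGS inverse-Hessian approximation H to a vector g
    via the standard two-loop recursion, with H0 = gamma * I.
    `pairs` is a list of (s, y) curvature pairs, oldest first."""
    q = np.asarray(g, dtype=float).copy()
    rhos = [1.0 / (s @ y) for s, y in pairs]
    alphas = []
    # First loop: newest pair to oldest.
    for (s, y), rho in zip(reversed(pairs), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    q *= gamma  # initial inverse-Hessian scaling H0 = gamma * I
    # Second loop: oldest pair to newest.
    for (s, y), rho, a in zip(pairs, rhos, reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return q
```

Wrapping a map like this in a matrix-free linear operator is essentially what the JSO `LBFGSOperator` does internally.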

mohamed82008 commented 3 years ago

I think we should use the Schur complement to solve the system. Assume D is the Hessian below.

*(screenshot of the block KKT matrix M, with D in the Hessian block)*

The linear system M \ b can be solved easily if we have an operator x -> D^-1 * x.
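Assuming the standard block structure M = [[D, Aᵀ], [A, 0]], the reduction goes: eliminate x = D⁻¹(b₁ − Aᵀλ) from the first block row, then solve the Schur complement system (A D⁻¹ Aᵀ) λ = A D⁻¹ b₁ − b₂. A numpy sketch (the function name `solve_kkt_schur` is illustrative; `Dinv` plays the role of the x → D⁻¹x operator):

```python
import numpy as np

def solve_kkt_schur(Dinv, A, b1, b2):
    """Solve [[D, A.T], [A, 0]] @ [x; lam] = [b1; b2] using only the
    operator Dinv: v -> D^{-1} v (e.g. an L-BFGS inverse-Hessian product).
    The small dense Schur complement S = A D^{-1} A.T is formed explicitly."""
    DinvAT = np.column_stack([Dinv(a) for a in A])  # D^{-1} A.T, one column per row of A
    S = A @ DinvAT                                  # Schur complement of D
    lam = np.linalg.solve(S, A @ Dinv(b1) - b2)
    x = Dinv(b1 - A.T @ lam)
    return x, lam
```

This only requires m applications of D⁻¹ to build S (m = number of constraints), which is cheap when D⁻¹ is an L-BFGS operator.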

sshin23 commented 3 years ago

Thanks for bringing this up @mohamed82008 @frapac. I agree that MadNLP should have a quasi-Newton option. The linear operator implementation for LBFGS in JSO looks great; maybe we can use this.

It seems to me that this may take a good amount of time though. I'll try to implement this over the summer but can't promise anything yet. I'll keep this issue open and update the progress here.

frapac commented 2 years ago

LBFGS has been implemented in this PR: #221 In the end, we do not depend on JSO for the LBFGS implementation; instead we use the compact representation introduced in:

Byrd, Richard H., Jorge Nocedal, and Robert B. Schnabel. "Representations of quasi-Newton matrices and their use in limited memory methods." Mathematical Programming 63, no. 1 (1994): 129-156.

This is similar (with slight differences) to what is currently implemented in Ipopt.
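For readers unfamiliar with the compact representation: with k stored pairs collected as columns of S (steps) and Y (gradient differences), and B₀ = σI, the Hessian approximation can be written as B = σI − W M⁻¹ Wᵀ with W = [σS, Y], instead of applying k rank-two updates. A numpy sketch (not MadNLP's code; `compact_bfgs` is an illustrative name) that forms B densely purely to check the algebra:

```python
import numpy as np

def compact_bfgs(S, Y, sigma):
    """Compact representation (Byrd, Nocedal & Schnabel, 1994) of the matrix
    obtained by k BFGS updates of B0 = sigma * I, with curvature pairs stored
    as columns of S (steps) and Y (gradient differences):
        B = sigma*I - W @ inv(M) @ W.T,   W = [sigma*S, Y].
    A real solver never forms B densely; this is only for illustration."""
    n = S.shape[0]
    STY = S.T @ Y
    L = np.tril(STY, -1)               # strictly lower-triangular part of S'Y
    D = np.diag(np.diag(STY))          # diag(s_i' y_i)
    W = np.hstack([sigma * S, Y])
    M = np.block([[sigma * (S.T @ S), L],
                  [L.T, -D]])
    return sigma * np.eye(n) - W @ np.linalg.solve(M, W.T)
```

The point of this form for an interior-point solver is that the L-BFGS contribution enters the KKT system as a low-rank correction W M⁻¹ Wᵀ, which can be handled with a small dense factorization alongside the sparse KKT factorization.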

frapac commented 1 year ago

Solved by #221

mohamed82008 commented 1 year ago

Been admiring this work from a distance. Glad it finally made it in 🎉

frapac commented 1 year ago

Thank you for your support! We would love to see MadNLP integrated into NonConvex.jl in the medium term :)

francis-gagnon commented 1 year ago

Maybe I'm missing something, but does this mean that user-defined functions with JuMP + MadNLP should work now?

I'm still getting the `Hessian information is needed.` error using JuMP 1.11.1 and MadNLP 0.7.0.

Related to : #115

Edit: I'm not sure you still receive notifications after the issue is closed, so: @frapac @sshin23

Thanks

frapac commented 6 months ago

@francis-gagnon indeed, I hadn't seen your comment before. MadNLP currently does not support user-defined operators inside MOI. This is resolved in this new PR: https://github.com/MadNLP/MadNLP.jl/pull/322