Closed: ioannisPApapadopoulos closed this issue 1 month ago
You can switch to the L-BFGS approximation implemented in Ipopt using the option `hessian_approximation`, e.g.:

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_optimizer_attribute(model, "hessian_approximation", "limited-memory")
```
The performance of L-BFGS depends on the size of the history used in the approximation (6 by default). Depending on your application, you can increase that number quite significantly (usually the higher, the better):

```julia
set_optimizer_attribute(model, "limited_memory_max_history", 10)
```
Please refer to Ipopt's documentation for more details.
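For reference, here is a minimal end-to-end sketch combining both options on a toy problem. The Rosenbrock objective is my own illustrative choice, not from the thread; the Ipopt option names are as documented above.

```julia
using JuMP, Ipopt

# Build a model that uses Ipopt's L-BFGS Hessian approximation
# instead of the exact (AD-computed) Hessian.
model = Model(Ipopt.Optimizer)
set_optimizer_attribute(model, "hessian_approximation", "limited-memory")
# Increase the L-BFGS history from the default of 6.
set_optimizer_attribute(model, "limited_memory_max_history", 10)

# Toy unconstrained problem: the Rosenbrock function, minimized at (1, 1).
@variable(model, x)
@variable(model, y)
@objective(model, Min, (1 - x)^2 + 100 * (y - x^2)^2)

optimize!(model)
```

After `optimize!`, Ipopt's log should report `hessian_approximation = limited-memory`, confirming that the exact Hessian callback is no longer used.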
Perfect, thank you very much @frapac !
Hi! I am trying to reproduce some bad behavior I used to see with Ipopt when the Hessian was approximated by BFGS rather than computed via AD. Currently the AD Hessian works beautifully, and I would like to contrast the two.
Is there a way to turn off the AD Hessian and force Ipopt to use its default BFGS approximation in this wrapper?
Thank you in advance!