google / jaxopt

Hardware accelerated, batchable and differentiable optimizers in JAX.
https://jaxopt.github.io
Apache License 2.0

Enable warm-starting the Hessian approximation in L-BFGS #351

Closed: zaccharieramzi closed this issue 1 year ago

zaccharieramzi commented 1 year ago

Currently one can only provide an initial estimate of the solution, i.e., warm-start the iterates. But for quasi-Newton methods it can also be a good idea to provide an initial estimate of the Hessian approximation, typically when solving a similar problem multiple times.

This was done, for example, in HOAG by @fabianp (see https://github.com/fabianp/hoag/blob/master/hoag/hoag.py#L109).

I am willing to implement this in the next few weeks.

Since I know it is of interest to them as well, cc-ing @marius311 and @mblondel.
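
For reference, here is a minimal sketch of the kind of manual warm start this feature would replace. It assumes jaxopt's internal LbfgsState exposes s_history, y_history, rho_history and gamma fields; those names are guesses about internals and may differ across versions:

import jax
import jax.numpy as jnp
import jaxopt

def loss(w, X, y):
    return jnp.mean((X @ w - y) ** 2)

key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (32, 3))
y_a = X @ jnp.array([1.0, -2.0, 0.5])
y_b = y_a + 0.01 * jax.random.normal(key, (32,))  # a nearby, similar problem

solver = jaxopt.LBFGS(fun=loss, maxiter=50)

# Cold solve on the first problem: curvature histories start empty,
# so the initial Hessian approximation is (a scaled) identity.
res = solver.run(jnp.zeros(3), X, y_a)

# Warm solve on the second problem: reuse the final iterate and seed
# the new state with the previous curvature history. The field names
# below are assumptions about the internal LbfgsState namedtuple.
state = solver.init_state(res.params, X, y_b)
state = state._replace(
    s_history=res.state.s_history,
    y_history=res.state.y_history,
    rho_history=res.state.rho_history,
    gamma=res.state.gamma,
)
params = res.params
for _ in range(solver.maxiter):
    params, state = solver.update(params, state, X, y_b)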

mblondel commented 1 year ago

That would be useful, but we need to think about an API for it. Maybe a named tuple:

init = LBFGSInit(params=params, grad=grad, hessian=hessian)
res = lbfgs.run(init, *args, **kwargs)
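
Concretely, something like this minimal sketch (LBFGSInit and the dispatch in run are hypothetical, proposed API, not existing jaxopt code):

from typing import Any, NamedTuple, Optional

import jax.numpy as jnp

class LBFGSInit(NamedTuple):
    # Hypothetical warm-start container; nothing here exists in jaxopt yet.
    params: Any                    # initial iterate (warm start of the solution)
    grad: Optional[Any] = None     # gradient at params, if already available
    hessian: Optional[Any] = None  # Hessian approximation, e.g. (s, y) histories

# run() could then dispatch on the input type: a bare pytree behaves as
# today, while an LBFGSInit also seeds the gradient and curvature history
# in the solver state.
init = LBFGSInit(params=jnp.zeros(3))
# res = lbfgs.run(init, *args, **kwargs)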

fabianp commented 1 year ago

Indeed, I think this would be very useful!