In this fork I'm trying to implement the Gauss-Newton and Levenberg-Marquardt methods for the Optax library.
The primary objective is to provide a flexible Gauss-Newton transformation with options for selecting the damping parameter, the solver, and whether to solve the normal equations. This transformation also makes it possible to solve least-squares problems by providing only the JVP of the residuals function, and to handle compositional problems by specifying the HVP of the outer function.
A simple usage example for the Gauss-Newton optimizer:
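Since the fork's exact transformation names and call signatures aren't shown here, the following is only a minimal sketch, in plain JAX, of the computation a Gauss-Newton step performs on a least-squares problem; `residuals` and `gauss_newton_step` are illustrative names, not the fork's API.

```python
import jax
import jax.numpy as jnp

def residuals(params, x, y):
    # Residuals of a simple exponential-decay model y ≈ a * exp(-b * x).
    a, b = params
    return a * jnp.exp(-b * x) - y

def gauss_newton_step(params, x, y):
    # Solve the normal equations J^T J d = -J^T r for the update d.
    r = residuals(params, x, y)
    J = jax.jacobian(residuals)(params, x, y)  # shape (num_points, num_params)
    d = jnp.linalg.solve(J.T @ J, -J.T @ r)
    return params + d

# Synthetic noise-free data generated with a = 2, b = 0.5.
x = jnp.linspace(0.0, 4.0, 20)
y = 2.0 * jnp.exp(-0.5 * x)

params = jnp.array([1.5, 0.6])
for _ in range(10):
    params = gauss_newton_step(params, x, y)
```

On this zero-residual problem the iteration converges quickly to `(2.0, 0.5)`. The fork's transformation would additionally let you avoid materializing `J` by supplying the JVP of `residuals` and using a matrix-free solver.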
The Gauss-Newton transformation can serve as a building block for constructing more sophisticated optimization solvers. As an illustration, I have incorporated the trust-region algorithm implemented in Jaxopt (Algorithm 6.18 in “Introduction to Optimization and Data Fitting”, K. Madsen & H. B. Nielsen) into the scale_by_madsen_trust_region transformation. As a consequence, we can seamlessly obtain the Levenberg-Marquardt method by composing it with the Gauss-Newton transformation described earlier.
The previous example becomes:
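Again, the fork's actual composition API may differ; conceptually, the composed update is a Levenberg-Marquardt step, i.e. Gauss-Newton with a damping term μI added to the normal equations, with μ adapted from step to step. A minimal sketch in plain JAX (the identifiers and the simple accept/reject schedule are illustrative; Madsen & Nielsen's Algorithm 6.18 adapts μ via a gain ratio instead):

```python
import jax
import jax.numpy as jnp

def residuals(params, x, y):
    # Residuals of the model y ≈ a * exp(-b * x).
    a, b = params
    return a * jnp.exp(-b * x) - y

def lm_step(params, x, y, mu):
    # Levenberg-Marquardt update: damped normal equations (J^T J + mu I) d = -J^T r.
    r = residuals(params, x, y)
    J = jax.jacobian(residuals)(params, x, y)
    d = jnp.linalg.solve(J.T @ J + mu * jnp.eye(J.shape[1]), -J.T @ r)
    return params + d

def lm_solve(params, x, y, mu=1.0, num_steps=30):
    # Shrink mu when the residual norm decreases (step accepted),
    # grow it otherwise (step rejected).
    for _ in range(num_steps):
        new_params = lm_step(params, x, y, mu)
        old_norm = jnp.linalg.norm(residuals(params, x, y))
        new_norm = jnp.linalg.norm(residuals(new_params, x, y))
        if new_norm < old_norm:
            params, mu = new_params, mu / 3.0
        else:
            mu = mu * 2.0
    return params

x = jnp.linspace(0.0, 4.0, 20)
y = 2.0 * jnp.exp(-0.5 * x)
params = lm_solve(jnp.array([1.0, 1.0]), x, y)
```

The damping makes the step robust far from the solution, while μ → 0 recovers the fast local convergence of pure Gauss-Newton.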
This is still a draft and will require more work, but feedback and suggestions for improvement are greatly appreciated. Please feel free to share your thoughts on the implementation and propose any enhancements or modifications.