google-deepmind / kfac-jax

Second Order Optimization and Curvature Estimation with K-FAC in JAX.
Apache License 2.0

Add Support for KFAC Optimization in LSTM and GRU Layers #188

Open neuronphysics opened 7 months ago

neuronphysics commented 7 months ago

Feature

I kindly request the addition of support for the Kronecker-Factored Approximate Curvature (KFAC) optimization technique for LSTM and GRU layers within the existing KFAC optimizer. Currently, the KFAC optimizer's curvature blocks are tailored to dense and 2D convolution layers. Extending its capabilities to cover RNN layers would be a significant enhancement.

Proposal

The proposal entails integrating KFAC optimization support for LSTM and GRU layers into the KFAC optimizer. This would involve adapting the optimizer to compute the requisite Kronecker-factored statistics, and the chain-structured linear Gaussian graphical model approximation, for LSTM and GRU layers; I could not find any public implementation of the latter.
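To make the request concrete, here is a minimal numpy sketch of the simplest Kronecker-factored approximation one could apply to a weight matrix shared across timesteps: treat the per-timestep statistics as independent and average the input and backpropagated-gradient second moments over both batch and time, then precondition with the damped inverse factors. This is an illustrative sketch of the general idea, not kfac-jax code; the function names and the independence-across-timesteps assumption are mine, and the full chain-structured linear Gaussian model from the literature is considerably more involved.

```python
import numpy as np

def kronecker_factors_rnn(activations, backprops):
    """Estimate the two Kronecker factors for one weight matrix shared over time.

    activations: shape (T, B, d_in)  -- layer inputs at each timestep.
    backprops:   shape (T, B, d_out) -- loss gradients w.r.t. the layer's
                 pre-activation outputs at each timestep.

    Under an independence-across-timesteps assumption, the Fisher block for
    the shared weight is approximated as A (x) G, with A and G the second
    moments averaged over both batch and time.
    """
    T, B, _ = activations.shape
    a = activations.reshape(T * B, -1)
    g = backprops.reshape(T * B, -1)
    A = a.T @ a / (T * B)   # input-covariance factor, shape (d_in, d_in)
    G = g.T @ g / (T * B)   # gradient-covariance factor, shape (d_out, d_out)
    return A, G

def precondition(grad, A, G, damping=1e-3):
    """Apply the damped inverse factors: (A + lam*I)^-1 @ grad @ (G + lam*I)^-1.

    grad has shape (d_in, d_out), matching the A (rows) and G (columns) factors.
    """
    d_in, d_out = grad.shape
    A_inv = np.linalg.inv(A + damping * np.eye(d_in))
    G_inv = np.linalg.inv(G + damping * np.eye(d_out))
    return A_inv @ grad @ G_inv
```

The point of the sketch is that the per-layer bookkeeping generalizes naturally from the dense case by summing statistics over timesteps; the hard part the proposal asks for is the more accurate temporal model, which this deliberately omits.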

Motivation

LSTM and GRU layers are foundational components for sequential data and time-series analysis. I wonder how much KFAC could improve the training of models using LSTM and GRU layers by providing accurate approximations of the Fisher information matrix. By supporting LSTM and GRU layers, the KFAC optimizer would let researchers apply the technique to a wider array of models, including reinforcement learning algorithms.

Additional Context

I have full confidence that the repository maintainers, particularly the first author of the paper introducing this approximation, would be well placed to implement this.

I appreciate your consideration of this feature request. Thank you.

james-martens commented 7 months ago

Yeah support for recurrent networks is something we have partially implemented internally. If there's interest I guess we could try to get this out sooner.

neuronphysics commented 7 months ago

Great to hear that support for recurrent networks is partially implemented. There is definitely interest in this feature, and making it public sooner would be much appreciated, especially for its application in RL models, which are my main interest.

neuronphysics commented 2 months ago

Is there any update on publishing the KFAC code for RNNs?

james-martens commented 2 months ago

Sorry, no. Myself and others have been very busy and haven't had time. If you're interested in using a Kronecker-factored method compatible with RNNs out of the box, you could try Shampoo or TNT, which make fewer assumptions about the structure of the network. I imagine that these are implemented in some open source library, but don't know specifically. We might eventually release support for these approaches in kfac_jax, but I have no timeline for that.