I've put some TODO: comments in the code to highlight a few things.
There are a few places where functions are not adequately documented: there is no Returns section in the docstring, and it isn't always clear to me what those functions return.
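For example (a hypothetical helper, just to illustrate the kind of Returns section I mean):

```python
import numpy

def clip_probability(p, epsilon=1e-6):
    """Clip probabilities away from 0 and 1 for numerical stability.

    Args:
        p (numpy.ndarray): probabilities to clip.
        epsilon (float): distance to keep from the 0/1 boundaries.

    Returns:
        numpy.ndarray: the clipped probabilities.
    """
    return numpy.clip(p, epsilon, 1.0 - epsilon)
```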
There are also some places where the variable names are not informative (e.g., in the MagnetizationBernoulli
class).
This is a massive refactor of the TAP code which paves the way for expansion along two orthogonal dimensions:
1] Usage with Gaussian, Ising, and other layers
2] Deep TAP models
It also removes the TAP_rbm subclass of models.Model and moves all of the processing into methods on the layers, the model object, and the fit method.
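To make the intended division of labor concrete, here is a rough, self-contained numpy sketch; the class and method names below are illustrative only and do not match the identifiers in the diff:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BernoulliLayerSketch:
    """A Bernoulli layer that owns its own TAP magnetization update."""

    def __init__(self, num_units, rng):
        self.bias = np.zeros(num_units)
        self.rng = rng

    def random_magnetization(self):
        # start the fixed-point iteration away from the 0/1 boundaries
        return self.rng.uniform(0.05, 0.95, size=self.bias.shape)

    def tap_update(self, m_self, m_other, weights):
        # second-order (Onsager) self-consistency update:
        # m_i <- sigmoid(a_i + sum_j W_ij m_j + (1/2 - m_i) sum_j W_ij^2 (m_j - m_j^2))
        var_other = m_other - m_other ** 2
        field = self.bias + weights @ m_other + (0.5 - m_self) * (weights ** 2 @ var_other)
        return sigmoid(field)

class TAPModelSketch:
    """The model object only wires the layers together and iterates their updates."""

    def __init__(self, num_visible, num_hidden, seed=137):
        rng = np.random.default_rng(seed)
        self.visible = BernoulliLayerSketch(num_visible, rng)
        self.hidden = BernoulliLayerSketch(num_hidden, rng)
        self.weights = 0.01 * rng.standard_normal((num_visible, num_hidden))

    def magnetization_fixed_point(self, num_iter=50):
        # plain (undamped) fixed-point iteration of the layer-wise updates
        m_v = self.visible.random_magnetization()
        m_h = self.hidden.random_magnetization()
        for _ in range(num_iter):
            m_h = self.hidden.tap_update(m_h, m_v, self.weights.T)
            m_v = self.visible.tap_update(m_v, m_h, self.weights)
        return m_v, m_h

model = TAPModelSketch(num_visible=6, num_hidden=4)
m_v, m_h = model.magnetization_fixed_point()
```

The point of the split is that the update rule lives on the layer, so supporting Gaussian or Ising layers only means giving those layers their own update, and a deep TAP model just chains more layer/weight pairs and sweeps the same per-layer updates.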
After this refactor, TAP training works as before for Bernoulli-Bernoulli models, but should be easy to extend to GRBMs.
As a temporary simplification, I have removed the options for TAP expansions beyond two terms, as well as the constraint-satisfaction method for minimizing the Gibbs free energy. These extensions will be revisited later.
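For reference, the two-term (mean-field plus Onsager) expansion that is kept is, in the Bernoulli-Bernoulli case with weights W, visible bias a, and hidden bias b, roughly:

```latex
\Gamma(m^v, m^h) =
    \sum_i \left[ m^v_i \ln m^v_i + (1 - m^v_i) \ln(1 - m^v_i) \right]
  + \sum_j \left[ m^h_j \ln m^h_j + (1 - m^h_j) \ln(1 - m^h_j) \right]
  - \sum_i a_i m^v_i - \sum_j b_j m^h_j
  - \sum_{ij} W_{ij}\, m^v_i m^h_j
  - \frac{1}{2} \sum_{ij} W_{ij}^2 \left( m^v_i - (m^v_i)^2 \right) \left( m^h_j - (m^h_j)^2 \right)
```

Dropping the last (Onsager) term recovers naive mean field; the options removed here were the higher-order corrections to this series.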
After this PR is merged, there should be no restriction on using TAP with deep models except for the marginal free energy computation.