drckf / paysage

Unsupervised learning and generative models in python/pytorch.

Refactor fit #69

Closed · drckf closed this 7 years ago

drckf commented 7 years ago

Consolidated some of the functionality in the fit.py module. Primary changes:

  1. Optimizer methods take gradient objects as arguments rather than computing the gradients themselves.
  2. Contrastive divergence (CD), persistent contrastive divergence (PCD), and TAP are implemented as functions in fit.py that compute approximations to the gradient of the negative log-likelihood. These functions are passed to a single StochasticGradientDescent class that performs the fit (a rough sketch of this structure is included at the end of this comment).

The remaining changes come from propagating the above, regenerating the docs, and small cleanup items (like removing the test.py placeholder).
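
To make the new shape concrete, here is a minimal, hypothetical sketch of the structure described above. It assumes a simple model interface (`sufficient_statistics`, `gibbs_sample`) and an optimizer exposing an `update` method; the actual names and signatures in fit.py may differ.

```python
# Hypothetical sketch of the refactored fit.py structure; names and the
# assumed model/optimizer interfaces are illustrative, not the actual API.
from collections import namedtuple

# Container for gradient estimates of the model parameters (assumed layout).
Gradient = namedtuple("Gradient", ["weights", "visible_bias", "hidden_bias"])

def contrastive_divergence(model, batch, steps=1):
    """Approximate the gradient of the negative log-likelihood with CD-k.

    Computes data statistics from the batch, runs `steps` rounds of Gibbs
    sampling to get model statistics, and returns their difference as a
    Gradient object. PCD and TAP would follow the same signature.
    """
    data_stats = model.sufficient_statistics(batch)        # assumed method
    sample = model.gibbs_sample(batch, steps=steps)        # assumed method
    model_stats = model.sufficient_statistics(sample)
    return Gradient(*(d - m for d, m in zip(data_stats, model_stats)))

class StochasticGradientDescent:
    """Single fitting loop: the gradient approximation is passed in, not chosen here."""

    def __init__(self, model, batch_iterator, optimizer, grad_approx):
        self.model = model
        self.batches = batch_iterator       # yields minibatches of data
        self.optimizer = optimizer          # exposes update(model, gradient)
        self.grad_approx = grad_approx      # e.g. contrastive_divergence

    def train(self, epochs=1):
        for _ in range(epochs):
            for batch in self.batches:
                grad = self.grad_approx(self.model, batch)
                # The optimizer receives the gradient object rather than
                # computing the gradients itself (change 1 above).
                self.optimizer.update(self.model, grad)
```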

drckf commented 7 years ago

Yeah, there is more to do. At some point, I think we should merge the Model and TAP_rbm classes because the TAP approximation is just another method for fitting models (i.e., Boltzmann machines). Once the two are merged, every Model could be trained using CD, PCD, or TAP. I figured that this could wait until the TAP methods were fully implemented.
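
A hypothetical usage sketch of what that merged interface might look like, reusing the illustrative names from the sketch above (`Model`, `Optimizer`, `batch_iterator`, `persistent_contrastive_divergence`, and `tap` are placeholders, not the actual paysage API):

```python
# Hypothetical sketch: one merged Model trained with any of the three
# gradient approximations; every name here is illustrative.
model = Model(n_visible=784, n_hidden=256)      # assumed constructor
optimizer = Optimizer(stepsize=1e-3)            # assumed, exposes update()

# PCD and TAP are assumed to share the (model, batch) signature of
# contrastive_divergence from the sketch above; batch_iterator is assumed
# to yield minibatches of training data.
for grad_approx in (contrastive_divergence,
                    persistent_contrastive_divergence,
                    tap):
    trainer = StochasticGradientDescent(model, batch_iterator,
                                        optimizer, grad_approx)
    trainer.train(epochs=10)
```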

d-rams commented 7 years ago

Yeah, that is right. I will merge unless there are last-minute objections.