Delayed objective calculation in the SGD and DCD reasoners.
This avoids extra non-optimizing passes through the data.
The objective change used as the stopping criterion is normalized by the number of terms, since future functionality may modify the number of ground terms.
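A minimal sketch of what a per-term-normalized stopping check could look like. The class and method names here are illustrative, not the actual reasoner API:

```java
public class ConvergenceCheck {
    /**
     * Sketch of a stopping criterion where the objective change is
     * normalized by the term count, so the tolerance stays meaningful
     * if grounding later changes the number of terms.
     */
    public static boolean hasConverged(double oldObjective, double newObjective,
                                       long numTerms, double tolerance) {
        double change = Math.abs(oldObjective - newObjective) / numTerms;
        return change < tolerance;
    }
}
```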
Removal of learning rate as an objective term instance variable.
Saves memory, as the learning rate is common across potentials and is now managed by the reasoner.
Added option for setting the learning schedule for SGD inference.
Added option for taking coordinate updates during SGD steps.
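A rough sketch of how a configurable learning schedule might select the step size per iteration. The enum and method names are assumptions for illustration, not the reasoner's actual option names:

```java
public class LearningSchedule {
    public enum Type { CONSTANT, STEP_DECAY }

    /**
     * Illustrative SGD step-size schedules: a fixed rate, or a
     * diminishing rate (a common choice for SGD convergence).
     */
    public static double learningRate(Type type, double baseRate, int iteration) {
        switch (type) {
            case STEP_DECAY:
                return baseRate / Math.sqrt(iteration + 1);
            case CONSTANT:
            default:
                return baseRate;
        }
    }
}
```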
Implementation of AdaGrad and Adam in the SGD reasoner.
Improves convergence of inference.
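For reference, a minimal single-variable sketch of the Adam update rule; the hyperparameter defaults follow the standard Adam formulation, and the class, field, and method names are illustrative rather than the code in this PR:

```java
public class AdamUpdate {
    private double m = 0.0;  // first-moment (mean) estimate of the gradient
    private double v = 0.0;  // second-moment (uncentered variance) estimate
    private int t = 0;       // step counter for bias correction

    private static final double BETA1 = 0.9;
    private static final double BETA2 = 0.999;
    private static final double EPSILON = 1.0e-8;

    /** Returns the new variable value after one Adam step. */
    public double step(double value, double gradient, double learningRate) {
        t++;
        m = BETA1 * m + (1.0 - BETA1) * gradient;
        v = BETA2 * v + (1.0 - BETA2) * gradient * gradient;
        // Bias-corrected moment estimates.
        double mHat = m / (1.0 - Math.pow(BETA1, t));
        double vHat = v / (1.0 - Math.pow(BETA2, t));
        return value - learningRate * mHat / (Math.sqrt(vHat) + EPSILON);
    }
}
```

AdaGrad would follow the same shape, but accumulate the sum of squared gradients instead of an exponential moving average.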
@sriramiscool Can you review this code for correctness?
@cfpryor After Sriram is done, can you review this code for style, consistency, and conventions?