snuspl / dolphin


Allow parameter initializer to push parameter updates instead of activations and error gradients. #113

Closed beomyeol closed 9 years ago

beomyeol commented 9 years ago

ParameterProvider has a push method that takes activation and error gradient vectors, and the parameter updates for each layer are generated inside LocalNeuralNetParameterProvider. Even though the way parameter updates are generated differs by layer type, LocalNeuralNetParameterProvider uses a single method that is only correct for fully connected layers. Thus, to support various layer types more easily, we can generate the parameter updates in each layer and push them to the parameter provider. Admittedly, sending activations and error gradients instead can reduce network cost, but we can revisit that later. A rough sketch of the change is below.
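For illustration only, here is a minimal sketch in Java of the proposed direction. Aside from `ParameterProvider.push` and `LocalNeuralNetParameterProvider`, which the issue mentions, all names here (`Matrix`, `LayerParameter`, `ParameterUpdateProvider`, `LayerBase`) are hypothetical and not the actual Dolphin API; the point is simply that each layer type supplies its own update rule and pushes the resulting update, instead of the provider computing updates from raw activations and gradients.

```java
// Hypothetical sketch; names other than ParameterProvider.push are assumptions.

/** Stands in for whatever vector/matrix type Dolphin uses internally. */
final class Matrix { /* elided */ }

/** A single layer's parameter update (e.g. weight and bias deltas). */
final class LayerParameter {
  final Matrix weightUpdate;
  final Matrix biasUpdate;

  LayerParameter(final Matrix weightUpdate, final Matrix biasUpdate) {
    this.weightUpdate = weightUpdate;
    this.biasUpdate = biasUpdate;
  }
}

/**
 * Current design: the provider receives activations and error gradients and
 * computes the updates itself, using one rule that only fits fully connected layers.
 */
interface ParameterProvider {
  void push(int layerIndex, Matrix activation, Matrix errorGradient);
}

/**
 * Proposed design: each layer computes its own update and pushes it, so
 * layer-specific update rules live inside the layer implementation.
 */
interface ParameterUpdateProvider {
  void push(int layerIndex, LayerParameter update);
}

/** Example layer base class that generates its own update during backpropagation. */
abstract class LayerBase {

  /** Propagates the error gradient to the previous layer. */
  abstract Matrix backPropagate(Matrix activation, Matrix errorGradient);

  /** Each layer type overrides this with its own parameter update rule. */
  abstract LayerParameter generateParameterUpdate(Matrix activation, Matrix errorGradient);

  /** Computes this layer's update and pushes it to the provider. */
  void pushUpdate(final ParameterUpdateProvider provider, final int layerIndex,
                  final Matrix activation, final Matrix errorGradient) {
    provider.push(layerIndex, generateParameterUpdate(activation, errorGradient));
  }
}
```

Under this sketch, adding a new layer type (e.g. convolutional) would only require overriding `generateParameterUpdate`, at the cost of pushing full update matrices over the network rather than the smaller activation/gradient vectors, which is the trade-off noted above.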