beomyeol closed this pull request 8 years ago.
@beomyeol Sorry for the late pass. I've left some comments for you. The sample dataset works well on both groupcomm and paramserver configurations.
@jsjason Thank you for your review. I've addressed your comments and left a comment about the type of the reduce values. Please take another look :)
@beomyeol I've done a second pass and left two comments. Thanks!
@jsjason I've addressed your comments. Please review the changes.
Looks great, will merge.
This pull request introduces batch processing for deep neural networks. The batch size used for training can be specified in the neural network configuration file. The changes involved in this PR are the following.
- `push` for `ParameterProvider` is changed: the size of the batch should be given.
- The value returned by `generateParameterGradient` is the element-wise sum of the parameter gradient matrices for each input instance in a batch.

This closes #141.
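The gradient aggregation described above can be sketched as follows. This is a minimal illustration, not the project's actual implementation; the function name mirrors `generateParameterGradient` only for readability, and NumPy arrays stand in for the parameter gradient matrices.

```python
import numpy as np

def generate_parameter_gradient(per_instance_gradients):
    """Element-wise sum of the parameter gradient matrices
    computed for each input instance in a batch."""
    batch_gradient = np.zeros_like(per_instance_gradients[0])
    for gradient in per_instance_gradients:
        batch_gradient += gradient  # element-wise accumulation
    return batch_gradient

# Example: a batch of 3 instances, each yielding a 2x2 gradient matrix
# filled with 1.0, 2.0, and 3.0 respectively.
grads = [np.full((2, 2), float(i)) for i in range(1, 4)]
result = generate_parameter_gradient(grads)
print(result)  # every entry is 1.0 + 2.0 + 3.0 = 6.0
```

Summing per-instance gradients (rather than averaging) matches the description in the list above; a trainer that expects a mean gradient would divide the result by the batch size afterwards.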