jihunchoi / recurrent-batch-normalization-pytorch

PyTorch implementation of recurrent batch normalization

eps and momentum parameters #11

Closed: gabrer closed this issue 5 years ago

gabrer commented 5 years ago

The initialization function of the SeparatedBatchNorm1d module has two arguments, eps and momentum:

def __init__(self, num_features, max_length, eps=1e-5, momentum=0.1,
                 affine=True):

Is this momentum the same one we use in our optimization algorithm (e.g. SGD), or is it an additional momentum used only for the batch normalization process?

I couldn't find any mention of this in the original paper.

cloudygoose commented 5 years ago

@gabrer You can find this in the documentation for BatchNorm1d: http://pytorch.org/docs/master/nn.html#torch.nn.BatchNorm1d
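For anyone landing here later: in PyTorch, batch norm's momentum is unrelated to optimizer momentum. It only sets the weight of the exponential moving average over the running mean and variance that are used in eval mode, and eps is the small constant added to the variance for numerical stability. Below is a minimal sketch using the built-in nn.BatchNorm1d; SeparatedBatchNorm1d in this repo presumably follows the same convention, since it exposes eps and momentum the same way.

```python
import torch
import torch.nn as nn

# Neither parameter has anything to do with optimizer (SGD) momentum:
#   * eps is added to the batch variance for numerical stability:
#       y = (x - mean) / sqrt(var + eps)
#   * momentum weights the exponential moving average of the running
#     statistics used at evaluation time:
#       running_mean = (1 - momentum) * running_mean + momentum * batch_mean
bn = nn.BatchNorm1d(num_features=4, eps=1e-5, momentum=0.1)

x = torch.randn(8, 4)  # a batch of 8 samples with 4 features
bn.train()
_ = bn(x)

# running_mean starts at zero, so after one training-mode forward pass it
# should equal momentum * batch_mean.
expected = 0.1 * x.mean(dim=0)
print(torch.allclose(bn.running_mean, expected))  # True
```

Note that this is the opposite of the usual moving-average convention, where momentum is the weight on the *old* value; here a smaller momentum means the running statistics change more slowly.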

gabrer commented 5 years ago

Thank you! I have already found it! :)