torch / torch7

http://torch.ch

Backpropagation through a batch normalization layer in model:eval() mode #1169

Closed Naman-ntc closed 6 years ago

Naman-ntc commented 6 years ago

Hi, there is an assert statement that prevents us from backpropagating through a batch normalization layer when the model is in eval mode. Is there any reason for this? In PyTorch there is no such restriction. Keeping the statistics of the batchnorm layer fixed while computing gradients should not be an issue, in my opinion.
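For reference, the PyTorch behavior the commenter describes can be demonstrated with a minimal sketch: in eval mode, `nn.BatchNorm1d` normalizes with its fixed running statistics, yet gradients still flow back to the input without any error. (This is an illustrative example of the PyTorch API, not torch7 code.)

```python
import torch
import torch.nn as nn

# BatchNorm layer switched to eval mode: running_mean/running_var
# are used for normalization and are NOT updated.
bn = nn.BatchNorm1d(4)
bn.eval()

x = torch.randn(8, 4, requires_grad=True)
y = bn(x)

# Backprop through the eval-mode batchnorm works fine in PyTorch.
y.sum().backward()
print(x.grad.shape)  # gradients reach the input
```

In eval mode the normalization is just an affine transform with constants (running mean, running variance, weight, bias), so its gradient with respect to the input is well defined and there is no mathematical reason to forbid the backward pass.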