Closed. AvivLugasi closed this issue 2 months ago.
Fixed a bug in the backward pass: `inv_var` was calculated as `inv_var = 1 / (self.batch_std ** 2 + EPSILON)`, while it should be `inv_var = 1 / ((self.batch_std ** 2 + EPSILON) ** 0.5)`, i.e. the reciprocal of the standard deviation rather than of the variance. The network works well now and converges faster. Closing the issue.
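For reference, here is a minimal NumPy sketch of the corrected backward pass, following the compact gradient formula from the Zakka post linked below. Only `batch_std`, `EPSILON`, and `inv_var` come from this issue; the function name, its signature, and the `EPSILON = 1e-5` value are illustrative assumptions, not the repo's actual API.

```python
import numpy as np

EPSILON = 1e-5  # assumed value; the repo defines its own EPSILON

def batchnorm_backward(dout, x, gamma, batch_mean, batch_std):
    """Illustrative batch norm backward pass over an (N, D) mini-batch,
    per https://kevinzakka.github.io/2016/09/14/batch_normalization/.
    `batch_std` is the per-feature standard deviation from the forward pass."""
    N = x.shape[0]

    # The fix: divide by the standard deviation, not the variance.
    # inv_var = 1.0 / (batch_std ** 2 + EPSILON)           # wrong: 1 / var
    inv_var = 1.0 / ((batch_std ** 2 + EPSILON) ** 0.5)    # right: 1 / std

    x_hat = (x - batch_mean) * inv_var          # normalized inputs
    dgamma = np.sum(dout * x_hat, axis=0)       # gradient of the scale
    dbeta = np.sum(dout, axis=0)                # gradient of the shift

    # Compact form of the input gradient.
    dxhat = dout * gamma
    dx = (inv_var / N) * (N * dxhat
                          - np.sum(dxhat, axis=0)
                          - x_hat * np.sum(dxhat * x_hat, axis=0))
    return dx, dgamma, dbeta
```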
Added a batch norm layer. However, it seems that for the current layer architecture and the MNIST data, I get better results without batch norm. That does not necessarily indicate an error in the implementation: I reviewed the code and checked both the forward pass and the backward pass against https://kevinzakka.github.io/2016/09/14/batch_normalization/. The issue won't be closed yet.
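For comparison, a minimal sketch of the forward pass as described in that post; again the function name and signature are illustrative assumptions, not the repo's code.

```python
import numpy as np

EPSILON = 1e-5  # assumed value

def batchnorm_forward(x, gamma, beta):
    """Illustrative batch norm forward pass over an (N, D) mini-batch,
    per https://kevinzakka.github.io/2016/09/14/batch_normalization/."""
    batch_mean = x.mean(axis=0)                                      # per-feature mean
    batch_std = x.std(axis=0)                                        # per-feature std
    x_hat = (x - batch_mean) / ((batch_std ** 2 + EPSILON) ** 0.5)   # normalize
    out = gamma * x_hat + beta                                       # scale and shift
    # Return the cached statistics needed by the backward pass.
    return out, x_hat, batch_mean, batch_std
```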