thkfly opened this issue 7 years ago
I agree the code is not straightforward and could be cleaned up, but the behavior is as follows: if BN_ACTIVATION is True, the LeakyReLU appears after the batch normalization layer, as stated on line 179; if it is False, the LeakyReLU appears before the batch normalization layer, and the BN layer therefore needs the identity activation.
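For readers following along, here is a minimal Lasagne sketch of that intended ordering. The block name `dense_bn_block`, the layer sizes, and the leakiness value are illustrative assumptions, not the repository's actual code:

```python
from lasagne.layers import InputLayer, DenseLayer, BatchNormLayer, NonlinearityLayer
from lasagne.nonlinearities import LeakyRectify, identity

leaky = LeakyRectify(0.01)  # leakiness value is an assumption

def dense_bn_block(incoming, num_units, bn_activation):
    """Hypothetical block illustrating the intended BN_ACTIVATION semantics."""
    # The dense layer applies no nonlinearity itself; ordering is made explicit below.
    layer = DenseLayer(incoming, num_units=num_units, nonlinearity=identity)
    if bn_activation:
        # BN_ACTIVATION = True: dense -> BN -> LeakyReLU
        layer = BatchNormLayer(layer)
        layer = NonlinearityLayer(layer, nonlinearity=leaky)
    else:
        # BN_ACTIVATION = False: dense -> LeakyReLU -> BN,
        # so the BN layer carries the identity activation.
        layer = NonlinearityLayer(layer, nonlinearity=leaky)
        layer = BatchNormLayer(layer)
    return layer

net = dense_bn_block(InputLayer((None, 128)), num_units=64, bn_activation=True)
```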
We noticed "BN_ACTIVATION = False # Controls the order of non-linearity, if True the non-linearity is performed after the BN" in params.py. However, I suspect the flag may be misused:
on lines 171 and 172 of tied_dropout_iterative_model.py, and
on line 179 of the same file.
The "Params.BN_ACTIVATION" has opposite behavior
I might thought you force open lasagne.nonlinearities.LeakyRectify as active layer
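To make that suspicion concrete, here is a purely hypothetical sketch of the pattern being described; none of this is the file's actual code:

```python
from lasagne.layers import InputLayer, DenseLayer, BatchNormLayer
from lasagne.nonlinearities import LeakyRectify

# Hypothetical: if the dense layer's nonlinearity were hard-coded like this,
# LeakyReLU would always run BEFORE batch normalization, regardless of
# Params.BN_ACTIVATION -- the opposite of what the comment in params.py states.
layer = DenseLayer(InputLayer((None, 128)), num_units=64,
                   nonlinearity=LeakyRectify(0.01))
layer = BatchNormLayer(layer)
```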