aviveise / 2WayNet


The meaning of Params.BN_ACTIVATION #1

Open · thkfly opened this issue 7 years ago

thkfly commented 7 years ago

We noticed the following in params.py: “BN_ACTIVATION = False # Controls the order of non-linearity, if True the non-linearity is performed after the BN”. However, I suspect the argument may be misused in:

nonlinearity=lasagne.nonlinearities.LeakyRectify(Params.LEAKINESS)
    if not Params.BN_ACTIVATION else lasagne.nonlinearities.identity,

on lines 171-172 of tied_dropout_iterative_model.py, and

model.append(BatchNormalizationLayer(
    model[-1],
    nonlinearity=lasagne.nonlinearities.LeakyRectify(Params.LEAKINESS)
        if Params.BN_ACTIVATION else lasagne.nonlinearities.identity))

on line 179 of the same file.

Params.BN_ACTIVATION appears to have the opposite behavior in these two places.

I would have thought you intended to always force lasagne.nonlinearities.LeakyRectify on as the activation layer.
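
If I read the two conditionals correctly, they select the following for each value of the flag (my own summary, not from the repo):

    # lines 171-172 (dense layer): LeakyRectify if not BN_ACTIVATION else identity
    # line 179 (BN layer):         LeakyRectify if BN_ACTIVATION else identity
    #
    # BN_ACTIVATION = True  -> dense: identity,     BN: LeakyRectify
    # BN_ACTIVATION = False -> dense: LeakyRectify, BN: identity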

aviveise commented 7 years ago

I agree the code is not straightforward and could be written more clearly, but the behavior is as follows: if BN_ACTIVATION is True, the LeakyReLU is applied after the batch normalization layer, as stated on line 179; if it is False, it is applied before the batch normalization layer, and the BN layer therefore needs the identity activation.
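
A minimal sketch of that wiring (illustrative only: it uses stock lasagne layers rather than the repo's custom BatchNormalizationLayer, and the helper name dense_bn_block plus the leakiness/bn_activation parameters are stand-ins for Params.LEAKINESS and Params.BN_ACTIVATION):

    import lasagne

    def dense_bn_block(incoming, num_units, leakiness=0.3, bn_activation=False):
        """Dense -> BN, with the LeakyReLU placed before or after BN."""
        leaky = lasagne.nonlinearities.LeakyRectify(leakiness)

        # If the non-linearity goes after BN, the dense layer stays linear.
        dense = lasagne.layers.DenseLayer(
            incoming, num_units,
            nonlinearity=lasagne.nonlinearities.identity if bn_activation else leaky)

        bn = lasagne.layers.BatchNormLayer(dense)

        # If the non-linearity goes after BN, attach it behind the BN layer;
        # otherwise the dense layer has already applied it.
        if bn_activation:
            bn = lasagne.layers.NonlinearityLayer(bn, nonlinearity=leaky)
        return bn

Either way, exactly one of the two layers applies the LeakyReLU; the flag only moves it relative to the batch normalization.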