aviveise / 2WayNet


Drop out #2

Open thkfly opened 7 years ago

thkfly commented 7 years ago

I noticed that the tied dropout also runs during the test phase. Why not remove it at test time, as is done in http://rishy.github.io/ml/2016/10/12/dropout-with-theano/ ?

aviveise commented 7 years ago

Hi, during testing the dropout only re-scales the activations to match the statistics of training. It does not eliminate any activations.
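To make the distinction concrete, here is a minimal NumPy sketch of the classic (non-inverted) dropout behaviour described above — zeroing units at train time, only rescaling at test time. This is an illustration, not the repository's actual Theano code; the function name and keep-probability handling are assumptions.

```python
import numpy as np

def dropout(x, p_drop, train, rng=np.random.default_rng(0)):
    """Classic (non-inverted) dropout, sketched for illustration --
    not the repository's actual Theano implementation."""
    if train:
        # training: zero each unit independently with probability p_drop
        mask = rng.random(x.shape) >= p_drop
        return x * mask
    # test: keep every unit, but rescale by the keep probability
    # so the activation statistics match those seen during training
    return x * (1.0 - p_drop)

x = np.ones(5)
print(dropout(x, 0.5, train=False))  # every unit kept, each scaled by 0.5
```

At test time no mask is sampled at all, so no neuron is switched off; the layer reduces to multiplication by a constant.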

thkfly commented 7 years ago

Hi, [image attached]

In one block, since dropout is applied after BN and the activation, it does not affect the activation within the same block. I guess your concern is that dropout might affect the activations of the next block through the fully connected layer. However, I hold that deleting the dropout layer at test time would leave more neurons activated, so the prediction might be better.
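One point worth checking here: at test time the dropout layer described above is only a constant scale, and a constant scale commutes with the fully connected layer that follows. A small NumPy sketch (shapes and values are hypothetical):

```python
import numpy as np

# At test time dropout reduces to multiplication by a constant s,
# so for the next fully connected layer W @ (s * h) == s * (W @ h):
# no neuron of h is switched off, only the overall magnitude changes.
rng = np.random.default_rng(1)
W = rng.standard_normal((3, 4))  # hypothetical FC weights
h = rng.standard_normal(4)       # activations after BN + activation
s = 0.5                          # keep probability used as the scale

print(np.allclose(W @ (s * h), s * (W @ h)))  # True
```

So under the rescaling scheme, no extra neurons are "deactivated" at test time; the next block just sees uniformly scaled inputs.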

thkfly commented 7 years ago

I hold that once the machine can distinguish the outline and the texture of a leaf individually, it would recognize the leaf better when all of its features are exposed.

I also know your code uses LeakyReLU rather than ReLU, but I do not think dropout would deform the LeakyReLU activations.
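For what it's worth, the test-time rescaling indeed cannot "deform" LeakyReLU: LeakyReLU is positively homogeneous, so a positive scale passes straight through it. A quick check (the `alpha` value is an assumption, not taken from the repo):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # alpha is a hypothetical slope, not necessarily the repo's value
    return np.where(x > 0, x, alpha * x)

x = np.linspace(-2.0, 2.0, 9)
s = 0.5  # test-time dropout scale (the keep probability)
# Positive homogeneity: f(s * x) == s * f(x) for any s > 0,
# so scaling before or after the activation gives the same result.
print(np.allclose(leaky_relu(s * x), s * leaky_relu(x)))  # True
```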