yaringal / ConcreteDropout

Code for Concrete Dropout as presented in https://arxiv.org/abs/1705.07832
MIT License

Confusion about initialization in bigger nets #10

Open koenigpeter opened 5 years ago

koenigpeter commented 5 years ago

Hey, I'm trying out concrete dropout with bigger nets (namely DenseNet121 and ResNet18), and for that I ported the Keras implementation of spatial concrete dropout to PyTorch. It works for DenseNet121 (the model converges) but, strangely, not for ResNet18, so I was wondering whether the initialization I used was wrong.

For both weight_regularizer and dropout_regularizer I used the initialization given in the MNIST example of the spatial concrete dropout Keras implementation (both are divided by the length of the training dataset). However, looking at the paper, you seem to have used 0.01 x N x H x W for the dropout regularizer with bigger models, and that multiplication would give a much, much larger factor than the 2. / N specified in the example. Which initialization is right? I would greatly appreciate it if you could clear up my confusion! Cheers!
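For concreteness, here is a small sketch of the two scalings I mean (the function names are mine; `N` is the number of training points, `l` the prior length scale from the Keras example, and `H`, `W` the feature-map size):

```python
def mnist_example_regularizers(N, l=1e-4):
    """Scaling from the Keras spatial-concrete-dropout MNIST example:
    both regularizers shrink as the training-set size N grows."""
    weight_regularizer = l ** 2 / N   # multiplies the L2 term on the weights
    dropout_regularizer = 2.0 / N     # multiplies the dropout-entropy term
    return weight_regularizer, dropout_regularizer


def paper_style_dropout_factor(N, H, W, scale=0.01):
    """The 0.01 * N * H * W factor I read out of the paper for bigger
    models -- this grows with N instead of shrinking, hence my confusion."""
    return scale * N * H * W


# With MNIST-sized numbers the two conventions differ by many orders of magnitude:
wd, dd = mnist_example_regularizers(N=60000)
big = paper_style_dropout_factor(N=60000, H=28, W=28)
```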

axel971 commented 3 years ago

Hi! I agree, and I am confused for the same reasons. I read the paper and did not understand how the weight regularizer and dropout regularizer are initialized. Could you please tell us what the prior length scale means, and which value to assign to this variable?
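In case it helps, my (possibly wrong) reading is that the length scale `l` parameterizes the prior over the weights, and the regularizers in the repo's examples follow l² / (τN) and 2 / (τN), with `tau` the model precision (often taken as 1 for classification) and `N` the number of training points. A sketch under that assumption:

```python
def regularizers_from_lengthscale(N, l=1e-4, tau=1.0):
    """Assumption, not confirmed by the authors: l is the prior length
    scale (larger l -> stronger weight shrinkage), tau the model
    precision, N the number of training points."""
    weight_regularizer = l ** 2 / (tau * N)
    dropout_regularizer = 2.0 / (tau * N)
    return weight_regularizer, dropout_regularizer
```

With `tau = 1` this reduces to the `l**2 / N` and `2. / N` values used in the MNIST example, which is why I believe this is the intended reading.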

JFagin commented 2 years ago

I am also confused about this.