2017-fall-DL-training-program / ConvNetwork


Weight Init Gain #4

pandasfang opened this issue 7 years ago

pandasfang commented 7 years ago

In the Lab1 implementation details, the weight-init "gain" is not specified (nninit.kaiming_normal() defines this parameter). Do we need to set this parameter, and what does it actually mean when we initialize a network?

JiaRenChang commented 7 years ago

Hi, the signature is torch.nn.init.kaiming_uniform(tensor, a=0, mode='fan_in'); keep the arguments at their defaults. See http://pytorch.org/docs/0.2.0/nn.html#torch-nn-init

This paper addresses the weight initialization problem: https://arxiv.org/pdf/1502.01852.pdf
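For reference, a minimal sketch of calling these initializers with their defaults, following the 0.2.0 docs linked above (the Conv2d shape is just an illustration, not taken from the lab):

import torch.nn as nn
import torch.nn.init as init

conv = nn.Conv2d(3, 16, kernel_size=3)    # example layer, not from the lab
init.kaiming_normal(conv.weight.data)     # He init, sampled from a normal distribution
# init.kaiming_uniform(conv.weight.data)  # the uniform counterpart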

Jia-Ren

fansia commented 7 years ago

Hi, I have two questions: 1) I notice in our lab PDF that the weight init is torch.nn.init.kaiming_normal instead of kaiming_uniform; which one is suggested for our lab? 2) When we say weight initialization, do we initialize only the first layer's parameters (the Conv2d in our lab) or all parameters in the whole network?

Thank you.

JiaRenChang commented 7 years ago

Hi,

  1. Oh, my bad, please use torch.nn.init.kaiming_normal(). The only difference between the two functions is the distribution the weights are sampled from (normal vs. uniform).

  2. Initialize all the parameters in the network. Jia-Ren
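A possible init_params along these lines (a sketch based on this thread, not the official lab code): He-initialize every Conv2d weight and zero its bias, then let net.apply() visit every submodule recursively.

import torch.nn as nn
import torch.nn.init as init

def init_params(m):
    # .apply() calls this on every submodule; act only on Conv2d layers
    if isinstance(m, nn.Conv2d):
        init.kaiming_normal(m.weight.data)
        if m.bias is not None:            # Conv2d may be built with bias=False
            init.constant(m.bias.data, 0)

net = ResNet20()        # ResNet20 is the model from the lab code
net.apply(init_params)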

Puff-Wen commented 7 years ago

Hi JiaRen,

When I applied init_params,

net = ResNet20()
net.apply(init_params)

I got a runtime error like the following.

RuntimeError: bool value of Variable objects containing non-empty torch.cuda.FloatTensor is ambiguous

Would you please help to clarify? Thanks.

JiaRenChang commented 7 years ago

Hi, what is your init_params? Note that you only need to initialize the weights of the convolutional layers.

bcpenggh commented 7 years ago

Hi Puff-Wen,

I am not sure whether you have solved the problem yet, but here are my two cents: I hit the same error until I replaced "if m.bias:" with "if m.bias is not None:" on lines 34 and 41 of utils.py.
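The reason, as far as I can tell: "if m.bias:" asks Python for the truth value of a multi-element tensor/Variable, which PyTorch rejects as ambiguous (hence the RuntimeError above), whereas comparing against None only checks whether the layer has a bias at all. A sketch of the fix inside init_params:

if m.bias is not None:            # was: if m.bias:  -> RuntimeError
    init.constant(m.bias.data, 0)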

Thanks for Brian's help.

Regards, BC