tyshiwo / DRRN_CVPR17

Code for our CVPR'17 paper "Image Super-Resolution via Deep Recursive Residual Network"

how to share weights during training? #3

Closed ChaofWang closed 7 years ago

ChaofWang commented 7 years ago

Hello, I'm very interested in this work, but I have a question: how are the weights shared during training, i.e., where in this code are the weights shared? Hoping for your reply. Thank you very much!

tyshiwo commented 7 years ago

Hi,

Thanks for your interest in our work. In Caffe, you can name the weight or bias blob of each conv layer. If you want to share the weights among different conv layers, just set the same name when defining those conv layers.

In Caffe's prototxt, you can find:

```
layer {
  name: "conv1_1a"
  type: "Convolution"
  bottom: "bn_conv1_1a"
  top: "conv1_1a"
  param { name: "RB1_wa" lr_mult: 1.000000 }
  param { name: "RB1_ba" lr_mult: 0.100000 }
  convolution_param {
    num_output: 128
    kernel_size: 3
    stride: 1
    pad: 1
    weight_filler { type: "msra" }
    bias_filler { type: "constant" value: 0 }
  }
}
```

in which there is a `name` field inside each `param` block.
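As an illustration of the sharing itself (this second layer is a hypothetical sketch, not copied from the repository's prototxt; the layer name `conv1_1b` and its bottom/top blobs are made up), a second conv layer that declares `param` entries with the same names is backed by the same weight and bias blobs as `conv1_1a`:

```
# Hypothetical second recursion reusing the blobs of conv1_1a.
# Only the names in the param blocks need to match; the layer
# name and bottom/top blobs must still be unique.
layer {
  name: "conv1_1b"
  type: "Convolution"
  bottom: "bn_conv1_1b"
  top: "conv1_1b"
  param { name: "RB1_wa" lr_mult: 1.000000 }  # same name => shared weight blob
  param { name: "RB1_ba" lr_mult: 0.100000 }  # same name => shared bias blob
  convolution_param {
    num_output: 128    # shared blobs must have identical shapes,
    kernel_size: 3     # so these settings must match conv1_1a
    stride: 1
    pad: 1
    weight_filler { type: "msra" }
    bias_filler { type: "constant" value: 0 }
  }
}
```

During the backward pass, Caffe accumulates the gradients from every layer that references a shared blob, so the tied copies stay identical throughout training.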

ChaofWang commented 7 years ago

Wow! Got it!