sshan-zhao / GASDA

Geometry-Aware Symmetric Domain Adaptation for Monocular Depth Estimation, CVPR 2019

About freeze batch norm #15

Open fyhfly opened 3 years ago

fyhfly commented 3 years ago

Hi, sorry for bothering you. I've been wondering for a long time why you freeze BN when training GASDA using the pretrained F_s, F_t and CycleGAN. If batch norm is frozen, what parameters will be optimized during training?

sshan-zhao commented 3 years ago

> Hi, sorry for bothering you. I've been wondering for a long time why you freeze BN when training GASDA using the pretrained F_s, F_t and CycleGAN. If batch norm is frozen, what parameters will be optimized during training?

  1. Firstly, freezing BN reduces the required memory. Secondly, we use a small batch size when training GASDA. Thirdly, when training GASDA all parts have already been pre-trained, so the BN layers can be fixed.
  2. The parameters in the convolutional layers (see the sketch after this list).
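
For reference, here is a minimal PyTorch sketch of that setup. It is not the code from this repository; the network, layer sizes and learning rate are placeholders. The idea is to put every BN layer in eval mode, exclude its affine parameters from the optimizer, and only update the conv weights and biases.

```python
import torch
import torch.nn as nn

def freeze_bn(module):
    # Put each BatchNorm layer into eval mode so its running statistics
    # stop updating, and disable gradients for its affine parameters.
    if isinstance(module, nn.BatchNorm2d):
        module.eval()
        if module.affine:
            module.weight.requires_grad_(False)
            module.bias.requires_grad_(False)

# `net` stands in for one of the pretrained sub-networks (e.g. a task net);
# it is a placeholder here, not the actual architecture used in GASDA.
net = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.BatchNorm2d(16),
    nn.ReLU(inplace=True),
    nn.Conv2d(16, 1, 3, padding=1),
)

net.apply(freeze_bn)
# Note: calling net.train() later flips BN back to training mode,
# so freeze_bn must be re-applied after every net.train() call.

# Only the parameters that still require gradients (the conv weights
# and biases) are handed to the optimizer.
optimizer = torch.optim.Adam(
    (p for p in net.parameters() if p.requires_grad), lr=1e-4
)
```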
fyhfly commented 3 years ago

> 1. Firstly, freezing BN reduces the required memory. Secondly, we use a small batch size when training GASDA. Thirdly, when training GASDA all parts have already been pre-trained, so the BN layers can be fixed.
> 2. The parameters in the convolutional layers.

The parameters in convolutional layers are weights and biases. If BN is frozen, in my view, those weights and biases will not change any more, so the performance of the network will not improve. When running gasda_model.py, two task nets (depth generation), two translation nets (CycleGAN) and four discriminators are trained. You freeze BN in the two task nets; does that mean these two nets won't be optimized? That's what I don't understand.

sshan-zhao commented 3 years ago

> If BN is frozen, in my view, those weights and biases will not change any more

Why? The parameters in the BN layers are fixed, but the parameters in the conv layers can still be changed.
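
To make that concrete, here is a small continuation of the hypothetical sketch from the earlier comment (again not this repo's code): after one optimizer step, only the conv weights and biases change, while the frozen BN parameters stay the same.

```python
import torch

# Continuing the sketch above: run one training step and compare parameters
# before and after the update to see which ones actually change.
x = torch.randn(2, 3, 32, 32)
before = {name: p.detach().clone() for name, p in net.named_parameters()}

optimizer.zero_grad()
loss = net(x).mean()  # dummy loss, just to produce gradients
loss.backward()
optimizer.step()

for name, p in net.named_parameters():
    changed = not torch.equal(before[name], p.detach())
    print(f"{name}: {'updated' if changed else 'unchanged'}")
# Expected: the conv weights/biases are updated, while the BN weight/bias
# (and its running mean/var, which are buffers) stay unchanged.
```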