Closed luhc15 closed 6 years ago
When using model.eval(), the running mean and variance are fixed to their pretrained values. A BN layer's weight and bias can be frozen by setting requires_grad=False.
In my model, I have set requires_grad = False for all batch-norm parameters, so the weights of all BN layers remain fixed. model.eval() is also used, which keeps the running mean and variance fixed.
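The two steps described above (freezing the affine parameters and switching BN to eval mode) can be sketched as a small helper, assuming PyTorch; the helper name freeze_batchnorm and the toy model are illustrative, not from the original thread:

```python
import torch
import torch.nn as nn

def freeze_batchnorm(model: nn.Module) -> None:
    """Freeze every BN layer: affine params and running statistics."""
    for module in model.modules():
        if isinstance(module, nn.modules.batchnorm._BatchNorm):
            # Stop gradient updates to weight (gamma) and bias (beta).
            for param in module.parameters():
                param.requires_grad = False
            # eval() makes the layer use (and stop updating) running stats.
            module.eval()

# Hypothetical usage on a small model:
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8), nn.ReLU())
model.train()            # the rest of the model stays in training mode
freeze_batchnorm(model)  # BN layers are switched to eval and frozen
```

One caveat: calling model.train() again flips the BN layers back to training mode, so the helper has to be re-applied after every such call (e.g. at the start of each epoch if the training loop calls model.train()).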
When model.eval() is set, the BN layers' weights and biases are fixed, but the running var and mean still change during fine-tuning. Does this change have any influence, or should the statistics be fixed by setting momentum=0?
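A quick way to see how the two options behave (a sketch, assuming PyTorch): in eval mode the running statistics are used but not updated, while in train mode with momentum=0 the batch statistics are used for normalization but the running statistics stay unchanged, since the update factor is zero:

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(4)
x = torch.randn(8, 4, 5, 5) * 3 + 2  # input with non-zero mean/variance

# Case 1: eval mode -- running stats are used, not updated.
bn.eval()
before = bn.running_mean.clone()
bn(x)
eval_unchanged = torch.equal(bn.running_mean, before)

# Case 2: train mode with momentum=0 -- batch stats are used for
# normalization, but the running stats receive a zero-weight update.
bn.train()
bn.momentum = 0.0
before = bn.running_mean.clone()
bn(x)
train_unchanged = torch.equal(bn.running_mean, before)

print(eval_unchanged, train_unchanged)
```

Note the two are not equivalent: momentum=0 in train mode still normalizes each batch with its own statistics, whereas eval mode normalizes with the stored running statistics.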