princeton-vl / RAFT


Why do you freeze batch norm layer for fine-tuning? #141

Open Seyoung9304 opened 2 years ago

Seyoung9304 commented 2 years ago

Hi, this is very nice work! Thanks for your contribution.

In your code, I found that you freeze every batch normalization layer when you fine-tune the model:

if args.stage != 'chairs':
    model.module.freeze_bn()
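For readers landing here, the snippet above only shows the call site. The issue does not include the body of `freeze_bn`, but a minimal sketch of what such a method typically does in PyTorch is below; the class name `TinyEncoder` and its layers are placeholders, not the RAFT architecture.

```python
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Minimal stand-in for a network that contains BatchNorm2d layers."""

    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(16)

    def freeze_bn(self):
        # Switch every BatchNorm2d module to eval mode: its running
        # mean/variance stop updating, and the statistics learned in the
        # earlier training stage are reused as-is during fine-tuning.
        for m in self.modules():
            if isinstance(m, nn.BatchNorm2d):
                m.eval()

    def forward(self, x):
        return self.bn(self.conv(x))
```

Note that calling `model.train()` puts BN modules back into training mode, so a freeze call like this only has an effect if it happens after `.train()`.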

I wonder why you froze the BN layers when fine-tuning the model. Is there a theoretical or experimental reason for it?

Thank you in advance :)