Closed WEIZHIHONG720 closed 2 years ago
Hi, when fine-tuning, how do I freeze the earlier layers and train only the fully connected layer? Thank you! I use `ipcsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py`:

```python
model = dict(
    backbone=dict(
        norm_eval=True,
        bn_frozen=True,
        bottleneck_mode='ip',
        pretrained=  # noqa: E251
        'https://download.openmmlab.com/mmaction/recognition/csn/ipcsn_from_scratch_r152_ig65m_20210617-c4b99d38.pth'  # noqa: E501
    ),
    cls_head=dict(
        num_classes=400,
        dropout_ratio=0.6,
    ))
```

How can I freeze every layer except the fully connected one? Thank you!
Hi, weizh, you can set `frozen_stages=4` in the backbone config to do so.
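As a minimal sketch of the suggestion above, the `frozen_stages=4` key can be added to the backbone dict of the config already quoted in the question (in MMAction2's ResNet-style backbones, this freezes the stem and all four residual stages, so only the classification head is trained). The exact config keys below mirror the question; `frozen_stages` is the only addition:

```python
# Fine-tuning config sketch: freeze the whole backbone, train only the head.
model = dict(
    backbone=dict(
        frozen_stages=4,   # freeze stem + all 4 residual stages
        norm_eval=True,    # keep BatchNorm in eval mode during training
        bn_frozen=True,    # do not update BN statistics
        bottleneck_mode='ip',
        pretrained=  # noqa: E251
        'https://download.openmmlab.com/mmaction/recognition/csn/ipcsn_from_scratch_r152_ig65m_20210617-c4b99d38.pth'  # noqa: E501
    ),
    cls_head=dict(
        num_classes=400,   # set to your dataset's number of classes
        dropout_ratio=0.6,
    ))
```

With this setting, the frozen stages' parameters get `requires_grad=False`, so only the `cls_head` (the fully connected layer) receives gradient updates.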