Closed Feobi1999 closed 9 months ago
When I set `find_unused_parameters = True`, I get another error.
Can you show me your training config here?
I just modified the backbone part to freeze some of its parameters.
I use the config `projects/configs/co_dino/co_dino_5scale_swin_large_1x_coco.py` with the backbone parameter `frozen_stages=3` and it works. Please set `use_checkpoint` to `False` in the backbone config.
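For readers unfamiliar with `frozen_stages`: freezing the first N stages amounts to disabling gradients on those submodules (parameters with `requires_grad=False` are usually not registered for gradient reduction, which is why DDP typically does not need `find_unused_parameters` for them). A minimal PyTorch sketch of the idea; `ToyBackbone` and `freeze_stages` here are hypothetical stand-ins, not the actual `SwinTransformerV1` implementation:

```python
import torch.nn as nn

# Toy backbone standing in for the Swin transformer in this thread;
# the real model's frozen_stages option does essentially this internally.
class ToyBackbone(nn.Module):
    def __init__(self):
        super().__init__()
        self.stages = nn.ModuleList(nn.Linear(8, 8) for _ in range(4))

    def forward(self, x):
        for stage in self.stages:
            x = stage(x)
        return x

def freeze_stages(model, frozen_stages):
    # Disable gradients on the first `frozen_stages` stages, mirroring
    # what frozen_stages=3 does for the backbone in the config.
    for stage in model.stages[:frozen_stages]:
        stage.eval()
        for p in stage.parameters():
            p.requires_grad = False

backbone = ToyBackbone()
freeze_stages(backbone, 3)
trainable = [name for name, p in backbone.named_parameters() if p.requires_grad]
print(trainable)  # only the last (unfrozen) stage's parameters remain
```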
Do I need to set `find_unused_parameters = True`?
No
Oh, I set `use_checkpoint` to `False` and it works, thanks so much!
I want to train on my own dataset, and I want to freeze the backbone to save time. The config I use is:
```python
model = dict(
    backbone=dict(
        _delete_=True,
        type='SwinTransformerV1',
        embed_dim=192,
        depths=[2, 2, 18, 2],
        num_heads=[6, 12, 24, 48],
        out_indices=(0, 1, 2, 3),
        window_size=12,
        ape=False,
        drop_path_rate=0.3,
        patch_norm=True,
        use_checkpoint=True,
        pretrained=pretrained,
        frozen_stages=3,
    ),
```
But I get an error like
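Pulling the thread's resolution together: the reported fix is to turn off gradient checkpointing while keeping the frozen stages. A sketch of the amended backbone section (same config as above; `pretrained` is assumed to be defined earlier in the file):

```python
model = dict(
    backbone=dict(
        _delete_=True,
        type='SwinTransformerV1',
        embed_dim=192,
        depths=[2, 2, 18, 2],
        num_heads=[6, 12, 24, 48],
        out_indices=(0, 1, 2, 3),
        window_size=12,
        ape=False,
        drop_path_rate=0.3,
        patch_norm=True,
        use_checkpoint=False,  # was True; setting False resolved the error per this thread
        pretrained=pretrained,
        frozen_stages=3,
    ),
```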