Sense-X / Co-DETR

[ICCV 2023] DETRs with Collaborative Hybrid Assignments Training
MIT License

how to freeze the backbone #96

Closed Feobi1999 closed 9 months ago

Feobi1999 commented 9 months ago

I want to train on my own dataset, and I want to freeze the backbone to save training time. The config I use is:

```python
model = dict(
    backbone=dict(
        _delete_=True,
        type='SwinTransformerV1',
        embed_dim=192,
        depths=[2, 2, 18, 2],
        num_heads=[6, 12, 24, 48],
        out_indices=(0, 1, 2, 3),
        window_size=12,
        ape=False,
        drop_path_rate=0.3,
        patch_norm=True,
        use_checkpoint=True,
        pretrained=pretrained,
        frozen_stages=3,
    ),
)
```

But I get an error (see the attached screenshot).

Feobi1999 commented 9 months ago

When I set `find_unused_parameters = True`, there is another error.

TempleX98 commented 9 months ago

Can you show me your training config here?

Feobi1999 commented 9 months ago

> Can you show me your training config here?

I just modified the backbone part by adding the frozen-stages parameter (see the attached screenshot).

TempleX98 commented 9 months ago

I use the config `projects/configs/co_dino/co_dino_5scale_swin_large_1x_coco.py` with the backbone parameter `frozen_stages=3` and it works. Please set `use_checkpoint` to `False` in the backbone config.
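For reference, a minimal sketch of the backbone override with both suggestions applied (this mirrors the mmdetection-style config posted above; the `pretrained` path is a placeholder, not an official checkpoint name):

```python
# Sketch of the fixed backbone override (mmdetection-style config dict).
pretrained = 'path/to/swin_large.pth'  # placeholder checkpoint path

model = dict(
    backbone=dict(
        _delete_=True,
        type='SwinTransformerV1',
        embed_dim=192,
        depths=[2, 2, 18, 2],
        num_heads=[6, 12, 24, 48],
        out_indices=(0, 1, 2, 3),
        window_size=12,
        ape=False,
        drop_path_rate=0.3,
        patch_norm=True,
        use_checkpoint=False,  # disable activation checkpointing, per the reply above
        pretrained=pretrained,
        frozen_stages=3,       # freeze the first 3 stages of the backbone
    ),
)
```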

Feobi1999 commented 9 months ago

> I use the config `projects/configs/co_dino/co_dino_5scale_swin_large_1x_coco.py` with the backbone parameter `frozen_stages=3` and it works. Please set `use_checkpoint` to `False` in the backbone config.

Do I need to set `find_unused_parameters = True`?

TempleX98 commented 9 months ago

> Do I need to set `find_unused_parameters = True`?

No.
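For anyone hitting the same error: a minimal sketch of what stage freezing amounts to, using a toy module rather than the repo's actual Swin implementation (my reading, not Co-DETR's exact code). Frozen parameters never receive gradients, which is what DDP's unused-parameter check can trip over when activation checkpointing is also enabled:

```python
import torch.nn as nn


class TinyBackbone(nn.Module):
    """Toy stand-in for a staged backbone; not Co-DETR's Swin implementation."""

    def __init__(self, num_stages=4, width=8):
        super().__init__()
        self.stages = nn.ModuleList(nn.Linear(width, width) for _ in range(num_stages))

    def freeze_stages(self, frozen_stages):
        # What frozen_stages effectively does: put the early stages in eval
        # mode and stop their parameters from receiving gradients.
        for stage in self.stages[:frozen_stages]:
            stage.eval()
            for p in stage.parameters():
                p.requires_grad = False

    def forward(self, x):
        for stage in self.stages:
            x = stage(x)
        return x


net = TinyBackbone()
net.freeze_stages(3)
# Stages 0-2 are frozen; only stage 3 still trains.
trainable = [any(p.requires_grad for p in s.parameters()) for s in net.stages]
```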

Feobi1999 commented 9 months ago

Oh, I set `use_checkpoint` to `False` and it works now. Thanks so much!