SHI-Labs / OneFormer

OneFormer: One Transformer to Rule Universal Image Segmentation, arXiv 2022 / CVPR 2023
https://praeclarumjj3.github.io/oneformer
MIT License

Freezing layers for DiNAT model #111

Pari-singh commented 6 months ago

Hi @praeclarumjj3, I trained the DiNAT backbone model on my custom images and got decent results. Now I want to fine-tune those trained weights for some internal tasks, where I will receive 500 new images on a regular basis. As you can imagine, combining the entire dataset and retraining from scratch each time is prohibitively expensive, so I am looking for a way to fine-tune the weights on each incoming batch of 500 images. However, I couldn't find a way to freeze layers for DiNAT: the config file (unlike the one for ResNet) does not have a FREEZE_AT option under MODEL.BACKBONE. Could you give more information on how to approach this problem?

Thanks
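For reference, here is a minimal sketch of one workaround I am considering, assuming OneFormer's detectron2-style meta-architecture where the backbone is registered under `model.backbone` (the module prefix and the trainer calls are assumptions; the exact names may differ in this repo):

```python
import torch

def freeze_backbone(model: torch.nn.Module) -> None:
    """Set requires_grad=False on every backbone parameter so that only
    the pixel decoder / transformer heads are updated during fine-tuning."""
    for name, param in model.named_parameters():
        # detectron2-style models expose backbone weights under the
        # "backbone." prefix; assumed to hold for the DiNAT backbone too.
        if name.startswith("backbone."):
            param.requires_grad = False

# Hypothetical usage with a detectron2-style trainer:
# model = Trainer.build_model(cfg)
# freeze_backbone(model)
# ... then build the optimizer and start training
```

If this is a valid substitute for a missing FREEZE_AT hook, the one caveat I can see is that the optimizer must be constructed after the flags are set (or filter its parameter groups on `p.requires_grad`) so the frozen backbone parameters are actually excluded from updates.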