fcdl94 / CoMFormer

Official implementation of "CoMFormer: Continual Learning in Semantic and Panoptic Segmentation"
https://arxiv.org/abs/2211.13999

About loss processing in MaskFormerDistillation #2

zhengyuan-xie closed this issue 12 months ago

zhengyuan-xie commented 12 months ago

Hi, thanks for your great work! I'm trying to reproduce the results in the paper and have run into a problem. I ran ade.sh (task 0), mentioned in the first issue, and the call to MaskFormerDistillation returns an empty dict. It seems that all losses are removed at line 182. What can I do to solve this? Would it be reasonable to add an 'else:' before 'losses.pop(k)' and modify the weight_dict in task 0 (keeping only loss_ce, loss_mask and loss_dice)? Looking forward to your reply!

fcdl94 commented 12 months ago

Hello! Yes, I noticed this problem. I made a mistake when uploading the code! You should add the else as in the original Mask2Former code (here: https://github.com/facebookresearch/Mask2Former/blob/9b0651c6c1d5b3af2e6da0589b719c514ec0d69a/mask2former/maskformer_model.py#L211).

I'll report it here for clarity:

    for k in list(losses.keys()):
        if k in self.criterion.weight_dict:
            losses[k] *= self.criterion.weight_dict[k]
        else:
            # remove this loss if not specified in `weight_dict`
            losses.pop(k)
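To illustrate the bug, here is a minimal standalone sketch (not the actual CoMFormer code; the loss values and weights below are made up). Without the `else`, `losses.pop(k)` runs unconditionally on every key, so even the weighted losses are removed and the dict comes back empty, which matches what you observed:

```python
# Hypothetical loss values and weights, for illustration only.
losses = {"loss_ce": 1.0, "loss_mask": 2.0, "loss_dice": 3.0, "loss_kd": 0.5}
weight_dict = {"loss_ce": 2.0, "loss_mask": 5.0, "loss_dice": 5.0}

# Buggy version: pop() runs for every key, emptying the dict.
buggy = dict(losses)
for k in list(buggy.keys()):
    if k in weight_dict:
        buggy[k] *= weight_dict[k]
    buggy.pop(k)  # unconditional -> every loss is removed

# Fixed version: only losses absent from weight_dict are dropped.
fixed = dict(losses)
for k in list(fixed.keys()):
    if k in weight_dict:
        fixed[k] *= weight_dict[k]
    else:
        fixed.pop(k)  # remove only the unweighted losses

print(buggy)  # {}
print(fixed)  # {'loss_ce': 2.0, 'loss_mask': 10.0, 'loss_dice': 15.0}
```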

I'll fix it soon. Thank you for pointing it out.

zhengyuan-xie commented 12 months ago

Thanks for your fast reply!