zhaozhengChen / ReCAM

The official code of CVPR 2022 paper (Class Re-Activation Maps for Weakly-Supervised Semantic Segmentation).

Generated pseudo labels (sem_seg) are all-black images, and DeepLab trained on them only predicts the background #26

Closed zbb1111 closed 7 months ago

zbb1111 commented 7 months ago

Hello, thank you very much for your excellent article. I have some confusion and I hope you can help me answer it:

  1. The pseudo labels (sem_seg/XX.png) that ReCAM generated for the 10,582 training images are all black, yet the mIoU reported in the log is about 0.7. I then used these 10,582 pseudo labels in place of SegmentationClassAug (12,031 images) to train DeepLab, but the final test result was only 0.03, with only the background being recognized. I am a beginner and may have misunderstood something, so I hope you can help me. Thank you! [screenshot attached]
zhaozhengChen commented 7 months ago

Thanks for your interest in our work.

  1. As the mIoU is correct, I think the pseudo labels themselves are fine. The PNG files store class indices in the range 0-20 (VOC), and such small values look almost black when the file is viewed directly; you can confirm this with the sketch after this list.
  2. Is the training loss of DeepLab normal (not NaN)? In my experiments I observed that the loss may not be stable.
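
A quick way to confirm this (a minimal sketch, not part of the repo; the file path is a placeholder) is to inspect the raw pixel values of one pseudo-label PNG:

```python
import numpy as np
from PIL import Image

# Placeholder path: any one of the generated pseudo-label PNGs.
label = np.array(Image.open("result/sem_seg/2007_000032.png"))

# For VOC the stored class indices should lie in 0-20,
# so the image looks nearly black in a normal image viewer.
print("unique class indices:", np.unique(label))

# A non-zero ratio means the mask is not actually empty.
print("non-background ratio:", float((label > 0).mean()))
```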
zbb1111 commented 7 months ago

Yes, the training loss of DeepLab is NaN. What is the reason for this, and how should I solve it?

zbb1111 commented 7 months ago

In addition, could you show me an example of what the visualized pseudo labels (sem_seg) should look like?

zbb1111 commented 7 months ago

I'm sorry to trouble you with so many questions. I hope you can reply when you have time. Sincere thanks!

I also found that DeepLab stopped at 99 iterations during training. Is this normal? I have run it several times and it is always like this. Could this be the reason for the NaN loss? [screenshot attached]

zhaozhengChen commented 7 months ago
  1. For the NaN problem, in most cases rerunning works. In the figure you provided, training does not stop at 99 iterations; it seems you got a warning, and I don't think that is the cause.
  2. For visualization, you can refer to some visualization code like this; a minimal palette-based sketch is also given below.
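
For reference, here is a minimal visualization sketch (not the repo's own code; the paths are placeholders). It maps the 0-20 class indices onto the standard PASCAL VOC color palette so the mask becomes clearly visible:

```python
import numpy as np
from PIL import Image

def voc_palette(num_classes=256):
    # Standard PASCAL VOC color map (bit-interleaving scheme).
    palette = np.zeros((num_classes, 3), dtype=np.uint8)
    for i in range(num_classes):
        c, j = i, 0
        while c:
            palette[i, 0] |= ((c >> 0) & 1) << (7 - j)
            palette[i, 1] |= ((c >> 1) & 1) << (7 - j)
            palette[i, 2] |= ((c >> 2) & 1) << (7 - j)
            c >>= 3
            j += 1
    return palette

# Placeholder paths: one pseudo-label PNG in, one colorized PNG out.
label = np.array(Image.open("result/sem_seg/2007_000032.png"))
color = Image.fromarray(label, mode="P")
color.putpalette(voc_palette().flatten().tolist())
color.save("2007_000032_color.png")
```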
zhaozhengChen commented 7 months ago

The NaN problem always occurs in the first few iterations, so you can watch the loss of the first few iterations to judge whether the run is normal.
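
For example, a simple guard like the sketch below (not the actual training script; the function and its arguments are just placeholders) can stop a run as soon as the loss turns NaN in the early iterations:

```python
import torch

def train_with_nan_check(model, loader, optimizer, criterion, warmup_iters=100):
    # Watch the loss during the first iterations; NaN usually shows up there.
    for it, (images, labels) in enumerate(loader):
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        if it < warmup_iters and not torch.isfinite(loss):
            raise RuntimeError(f"Loss is NaN/Inf at iteration {it}; restart the run.")
        loss.backward()
        optimizer.step()
```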

zbb1111 commented 7 months ago

I have solved this problem, thank you very much!!! Wishing you all the best!