schuy1er / EWF_official

An official code for "Endpoints Weight Fusion for Class Incremental Semantic Segmentation"
MIT License

The result of incremental class is 0. #5

Closed Zzsf11 closed 1 month ago

Zzsf11 commented 2 months ago

Hi, I find this work to be both elegant and easy to understand, so I've attempted to apply it to my dataset. However, I've noticed that the incremental class values are close to zero. I also tried to reproduce the VOC results and encountered a similar issue.

I'm quite confused by this and would appreciate any explanations or solutions you might offer.

My dataset result: (screenshot attached)

VOC result: (screenshot attached)

schuy1er commented 2 months ago

Hi, have you tried to reproduce PLOP? How is its performance?

Zzsf11 commented 2 months ago

Yes, but I have only tried it on my own dataset (screenshot attached). It seems to have a similar problem.

schuy1er commented 2 months ago

If PLOP does not perform well on this dataset, then EWF may also have difficulty achieving good results on new classes, because EWF is used in conjunction with distillation. EWF focuses more on alleviating forgetting; its improvement on new classes shows up mostly in multi-step settings, where it reduces forgetting in the last few stages after the new classes are trained. My suggestion is to try MiB, whose constraint is more relaxed than PLOP's. In addition, you can try adjusting EWF's weighted-averaging coefficient alpha, but this only helps once the model can first learn the new classes.
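For reference, the alpha-weighted endpoint fusion mentioned above can be sketched as follows. This is a minimal illustration, not the repo's exact implementation: it assumes the old-model and new-model checkpoints are PyTorch state dicts with matching keys, and `alpha` is the fusion coefficient (higher alpha keeps more of the old model, i.e. less forgetting).

```python
import torch

def fuse_endpoints(old_state, new_state, alpha=0.5):
    """Linearly interpolate two model state dicts:
    fused = alpha * old + (1 - alpha) * new.
    Higher alpha preserves old-class knowledge (less forgetting);
    lower alpha favors the newly learned classes.
    """
    fused = {}
    for key, old_w in old_state.items():
        new_w = new_state[key]
        if torch.is_floating_point(old_w):
            fused[key] = alpha * old_w + (1.0 - alpha) * new_w
        else:
            # Non-float buffers (e.g. BatchNorm's num_batches_tracked):
            # keep the new model's value rather than interpolating.
            fused[key] = new_w.clone()
    return fused
```

You would then load the fused dict back with `model.load_state_dict(fused)` and evaluate; sweeping `alpha` only makes sense once the new classes are actually being learned, as noted above.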

schuy1er commented 2 months ago

Also, there is a bug that may cause performance differences. You can try applying this fix; I hope it helps: https://github.com/zhangchbin/RCIL/commit/83ec4191b2056b314b5409b701e2e8a0ae0709f6

Zzsf11 commented 2 months ago

Thanks for your explanation! I have tried MiB on my dataset (screenshot attached). The result is still bad. When I visualize the incremental classes, I've noticed that the primary issue is correct masks being assigned to the wrong classes. Some incremental classes are very similar to the base classes, which could be causing the poor results. So I think methods that enhance classification ability might improve this.

schuy1er commented 2 months ago

So it seems that MiB is good at recognizing new classes? From what I can see, the main reason for its poor performance is forgetting. Can you try MiB + EWF? That may help a little, but I can't guarantee it.