yan-hao-tian / VW

iclr2024 poster Varying Window Attention
MIT License

VW-SegFormer v.s. VW-Mask2Former #16

Closed htwang14 closed 5 months ago

htwang14 commented 5 months ago

Dear Authors,

Thank you for sharing the code for your outstanding research.

I noticed in your paper that the experiments for VW-SegFormer were conducted on the COCO and Cityscapes datasets, while those for VW-Mask2Former were conducted on ADE20K. This makes it challenging for readers to determine which method offers better performance. Could you share any results or insights comparing these two methods on the same dataset?

The results don't need to be rigorously scientific, but any information would be helpful for me to decide which approach to use as a starting point for my work.

Thank you in advance for your assistance!

yan-hao-tian commented 5 months ago

Hello, and thanks for your interest. The code and models haven't been fully organized yet. On ADE20K we ran both VW-SegFormer and VW-Mask2Former — that should be in the paper — and VW-Mask2Former is better. On Cityscapes I have also run VW-Mask2Former, and it is slightly better than VW-SegFormer as well. I haven't run VW-Mask2Former on COCO yet; I will run it soon, though training on COCO takes longer.

htwang14 commented 5 months ago

Thank you so much for the fast reply! Sorry I missed part of the results in the paper.