lhoyer / DAFormer

[CVPR22] Official Implementation of DAFormer: Improving Network Architectures and Training Strategies for Domain-Adaptive Semantic Segmentation

About accuracy #23

Closed. BUAA-LKG closed this issue 2 years ago.

BUAA-LKG commented 2 years ago

Hi, thank you for your wonderful work. I have a question and would appreciate it if you could answer it.

When I run `python run_experiments.py --config configs/daformer/gta2cs_uda_warm_fdthings_rcs_croppl_a999_daformer_mitb5_s0.py`, the best mIoU in the console output is 66.21%. Is this normal? If not, how can I obtain the mIoU reported in the paper (68.3%)?

Another question: when evaluating performance, do you use the teacher network or the student network?

BUAA-LKG commented 2 years ago

Also, can you provide a configuration file for the pretrained weights (211108_1622_gta2cs_daformer_s0_7f24c)? I am confused because the decoder head type in the downloaded log is "UniHead".

lhoyer commented 2 years ago

> When I run `python run_experiments.py --config configs/daformer/gta2cs_uda_warm_fdthings_rcs_croppl_a999_daformer_mitb5_s0.py`, the best mIoU in the console output is 66.21%. Is this normal? If not, how can I obtain the mIoU reported in the paper (68.3%)?

Thanks a lot for your interest in our work. Your result is slightly lower than expected. Do you use exactly the same library versions and installation process as described in the README.md? Also, do you use the same GPU (RTX 2080 Ti)? The results in the paper were obtained by training DAFormer with three different random seeds (0, 1, and 2) and averaging the final performance. I would recommend the same to you, as the performance can vary across random seeds.
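For illustration, a multi-seed run could be scripted roughly as below. This is a minimal sketch, assuming seed variants of the config exist as `..._s1.py` and `..._s2.py` alongside the `_s0.py` config; the averaged mIoU values are placeholders to be read from the actual training logs.

```python
import subprocess

# Sketch: launch one training run per random seed (0, 1, 2), assuming the
# repository ships analogous *_s1.py and *_s2.py configs next to *_s0.py.
base = ('configs/daformer/'
        'gta2cs_uda_warm_fdthings_rcs_croppl_a999_daformer_mitb5_s{}.py')
for seed in range(3):
    subprocess.run(
        ['python', 'run_experiments.py', '--config', base.format(seed)],
        check=True)

# After training, average the final mIoU of each run as read from its log.
final_mious = [66.2, 67.5, 68.0]  # placeholder values, not real results
print(f'mean mIoU over seeds: {sum(final_mious) / len(final_mious):.2f}')
```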

> Another question: when evaluating performance, do you use the teacher network or the student network?

The performance is evaluated using the student network following the procedure in DACS.
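For context, in a mean-teacher setup such as DACS, the teacher is an exponential moving average (EMA) of the student and is only used to generate pseudo-labels during training. The sketch below illustrates the EMA update; the names `teacher`, `student`, and `alpha` are illustrative assumptions, not DAFormer's exact identifiers.

```python
import torch

@torch.no_grad()
def update_ema(teacher: torch.nn.Module, student: torch.nn.Module,
               alpha: float = 0.999):
    """EMA update: teacher = alpha * teacher + (1 - alpha) * student.

    Illustrative sketch of the mean-teacher mechanism, not DAFormer's
    actual code.
    """
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.data.mul_(alpha).add_(s_param.data, alpha=1 - alpha)

# At test time, only the student network is evaluated (as in DACS); the
# teacher exists solely to produce pseudo-labels during training.
```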

> Also, can you provide a configuration file for the pretrained weights (211108_1622_gta2cs_daformer_s0_7f24c)? I am confused because the decoder head type in the downloaded log is "UniHead".

The checkpoint of 211108_1622_gta2cs_daformer_s0_7f24c is accompanied by the config file used for its training. After the paper submission, I refactored the source code to improve readability. During that, "UniHead" was renamed to "DAFormerHead". As the training of 211108_1622_gta2cs_daformer_s0_7f24c was executed before the refactor, it still uses the old name.
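For illustration, the head type appears in the mmseg-style model config roughly as sketched below; everything besides the type names is a placeholder.

```python
# Sketch of the relevant fragment of an mmseg-style model config.
model = dict(
    type='EncoderDecoder',
    decode_head=dict(
        type='DAFormerHead',  # listed as 'UniHead' in pre-refactor logs
        # ... further head parameters ...
    ),
)
```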

BUAA-LKG commented 2 years ago

Thank you very much for your prompt reply!

Indeed, our graphics card and library versions are not completely consistent with yours. Is this the main reason for the performance gap? (In other words, do I have to set up your environment exactly as described in the README.md to reproduce the results?) However, Figure 4 in the paper suggests that the performance is robust to the random seed.

I'm trying more experiments. Thank you again for your excellent work.

lhoyer commented 2 years ago

Yes, I would recommend using the exact same library versions in order to reproduce the original results. Other library versions can slightly change the behavior of existing functionality, so that the results are no longer well reproducible.

BUAA-LKG commented 2 years ago

> Yes, I would recommend using the exact same library versions in order to reproduce the original results. Other library versions can slightly change the behavior of existing functionality, so that the results are no longer well reproducible.

OK, I will try again.

BUAA-LKG commented 2 years ago

Sorry to bother you again. Should I set `drop_path_rate=0` for the `ema_model`? I ask because I see that the checkpoint of 211108_1622_gta2cs_daformer_s0_7f24c does so.

lhoyer commented 2 years ago

You can just use https://github.com/lhoyer/DAFormer/blob/master/configs/daformer/gta2cs_uda_warm_fdthings_rcs_croppl_a999_daformer_mitb5_s0.py. It has the same functionality as the original config. As mentioned, I did some refactoring, so some flags in the old config are no longer compatible with this repository. With the refactored code, I am also able to reproduce the results in the paper.

lhoyer commented 2 years ago

For example, DropPath is disabled by default in dacs.py: https://github.com/lhoyer/DAFormer/blob/8d6e710700ff5e6a053c77bfe384ba44d4672cbe/mmseg/models/uda/dacs.py#L260
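For illustration, disabling stochastic depth on a teacher model typically amounts to forcing the DropPath (and dropout) modules into eval mode, along the lines of the sketch below. This mirrors the general technique rather than the exact code at the linked line.

```python
import torch.nn as nn
from timm.models.layers import DropPath  # assuming timm's DropPath, as in the MiT backbone

def disable_stochastic_regularization(ema_model: nn.Module) -> None:
    # Keep the teacher deterministic: force dropout and stochastic-depth
    # modules into eval mode so they act as the identity when producing
    # pseudo-labels.
    for m in ema_model.modules():
        if isinstance(m, (nn.modules.dropout._DropoutNd, DropPath)):
            m.training = False
```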

BUAA-LKG commented 2 years ago

Yeah, thanks for your quick reply.