VCIP-RGBD / DFormer

[ICLR 2024] DFormer: Rethinking RGBD Representation Learning for Semantic Segmentation
https://yinbow.github.io/Projects/DFormer/index.html
MIT License
142 stars · 24 forks

Fluctuations in training results under the same configuration #27

Open lsm0627 opened 1 month ago

lsm0627 commented 1 month ago

Dear author, thank you for your excellent work. As a beginner, I ran into a problem while reproducing the code: training with the same configuration, my results varied between 56.8 and 57.1. Is there a way to make the training results consistent? That would let me modify your code more effectively.

yinbow commented 1 month ago

Thanks for your attention to our work!

We suggest fixing the random seed for both the training process and the dataloader. You can change line 27 of 'train.sh' from '--no-use_seed' to 'use_seed', and uncomment lines 142-143 and 187-188 of 'DFormer/utils/dataloader/dataloader.py' to fix these two processes.
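For reference, seed fixing in a PyTorch training script usually looks like the sketch below. This is a generic illustration of the technique, not the exact code in DFormer; the function name `set_seed` is hypothetical, and it assumes NumPy and PyTorch are installed.

```python
import os
import random

import numpy as np
import torch


def set_seed(seed: int = 0) -> None:
    """Fix all common sources of randomness for (mostly) reproducible training.

    Note: even with all seeds fixed, some CUDA kernels are non-deterministic,
    so small run-to-run fluctuations can remain.
    """
    random.seed(seed)                      # Python's built-in RNG
    np.random.seed(seed)                   # NumPy RNG (used by many augmentations)
    torch.manual_seed(seed)                # PyTorch CPU RNG
    torch.cuda.manual_seed_all(seed)       # PyTorch RNGs on all GPUs
    os.environ["PYTHONHASHSEED"] = str(seed)
    # Trade speed for determinism in cuDNN convolution algorithm selection:
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
```

To also fix the dataloader's worker processes, pass a seeded `worker_init_fn` and `generator` to `torch.utils.data.DataLoader`, which is roughly what uncommenting the indicated lines in 'dataloader.py' enables.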

However, the training results still will not be identical every run, and it is normal for them to fluctuate within a small range. The performance reported in our paper is the average over several training runs.
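Reporting the average over runs, as described above, takes only a few lines. The sketch below uses the two mIoU values from this thread (56.8 and 57.1) purely as an illustration; with more runs you would append each result to the list.

```python
from statistics import mean, stdev

# mIoU values observed across repeated trainings with the same config
# (the two endpoints reported in this issue, used here as an example)
mious = [56.8, 57.1]

print(f"mIoU: {mean(mious):.2f} +/- {stdev(mious):.2f} over {len(mious)} runs")
```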