lhoyer / DAFormer

[CVPR22] Official Implementation of DAFormer: Improving Network Architectures and Training Strategies for Domain-Adaptive Semantic Segmentation

Question about loss #26

Closed Yulu-gan closed 2 years ago

Yulu-gan commented 2 years ago

I found that the clean_loss and mix_loss are defined in dacs.py. As the paper says, these two losses are summed and backpropagated, but I haven't found the corresponding file or functions that do this. I'd be grateful for your help.
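For context, in MMSegmentation-style codebases the per-term losses are usually returned as a dict from `forward_train`, then summed into a single scalar before `.backward()` is called by the runner. A minimal sketch of that summation convention (the key names and values below are illustrative, not the exact DAFormer code):

```python
# Hedged sketch: how MMSegmentation-style training combines per-term
# losses. Key names ('decode.loss_seg', 'mix.decode.loss_seg') mirror
# the discussion; this is not the exact DAFormer implementation.

def parse_losses(losses):
    """Sum every entry whose key contains 'loss' into one scalar,
    mimicking the _parse_losses step of a train_step: the runner
    later calls .backward() on this single total."""
    total = sum(v for k, v in losses.items() if 'loss' in k)
    log_vars = dict(losses, loss=total)
    return total, log_vars

# forward_train would return dicts like these for the clean (source)
# batch and the mixed (DACS) batch; here plain floats stand in for
# the scalar tensors used in practice.
losses = {'decode.loss_seg': 0.5,       # clean (source) loss
          'mix.decode.loss_seg': 0.25}  # mixed (pseudo-label) loss
total, log_vars = parse_losses(losses)
print(total)  # 0.75 -> single scalar driving one backward pass
```

So the addition typically lives in the loss-parsing step of the training loop rather than in a dedicated function inside dacs.py.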

Yulu-gan commented 2 years ago

I've solved the question above. Another question that confuses me: I noticed the model is invoked via `self.get_model().forward_train(...)` / `encode_decode(...)` in dacs.py. Does calling `forward_train` mean computing the loss and performing an EMA update? I've also noticed there are three `forward_train(...)` calls; does that mean each model is independent?
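As background on the EMA part of the question: in self-training UDA methods of this family, `forward_train` usually only computes losses on the student, while the teacher weights are updated separately each iteration as an exponential moving average of the student weights. A hedged sketch of such an EMA step (parameter names and the warm-up rule are illustrative, not necessarily the exact DAFormer code):

```python
# Hedged sketch of an EMA teacher update as commonly used in
# DACS/DAFormer-style self-training. Plain floats stand in for
# parameter tensors; this is not the exact DAFormer implementation.

def ema_update(ema_params, student_params, alpha, iteration):
    """Blend student weights into the teacher (EMA) weights.

    A common warm-up trick: cap the momentum early in training so
    the teacher tracks the student closely at the start.
    """
    a = min(1 - 1 / (iteration + 1), alpha)
    return [a * e + (1 - a) * s
            for e, s in zip(ema_params, student_params)]

ema = [0.0]      # teacher weight
student = [1.0]  # student weight
ema = ema_update(ema, student, alpha=0.99, iteration=0)
print(ema)  # [1.0] -> at iteration 0 the teacher just copies the student
```

Under this pattern, the three `forward_train` calls would all run the same student network on different batches (e.g. source, mixed, and a feature-distance pass), while the teacher is only read for pseudo-labels via `encode_decode` and updated by the EMA step.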