Haochen-Wang409 / U2PL

[CVPR'22] Semi-Supervised Semantic Segmentation Using Unreliable Pseudo-Labels
Apache License 2.0

Question #125

Closed Hugo-cell111 closed 1 year ago

Hugo-cell111 commented 1 year ago

Hi! In the code at https://github.com/Haochen-Wang409/U2PL/blob/main/train_semi.py#L514, contra_loss is adapted to the distributed setting with dist.all_reduce(contra_loss). What happens if I delete this line of code, and when do I need it? Thanks!

Haochen-Wang409 commented 1 year ago

You can delete this line when training on a single GPU.
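To make the single-GPU case explicit, a common pattern is to guard the reduction so the same code runs in both settings. This is a minimal sketch, not the repo's actual code; `reduce_loss` is a hypothetical helper name:

```python
import torch
import torch.distributed as dist

def reduce_loss(loss: torch.Tensor) -> torch.Tensor:
    """Hypothetical helper: average a scalar loss across all ranks.

    Falls back to a no-op when torch.distributed is not initialized,
    so the same training script runs unchanged on a single GPU.
    """
    if dist.is_available() and dist.is_initialized():
        # In-place sum of the tensor across all processes.
        dist.all_reduce(loss)
        loss = loss / dist.get_world_size()
    return loss
```

On a single-GPU run, `dist.is_initialized()` is False and the reduction is skipped, which is why deleting the bare `dist.all_reduce(contra_loss)` line is safe there.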

Hugo-cell111 commented 1 year ago

Could I put it this way: built-in loss functions such as nn.CrossEntropyLoss don't need it, but customized loss functions such as compute_contra_memobank_loss need dist.all_reduce in order to aggregate the tensors from all GPUs?

Haochen-Wang409 commented 1 year ago

You may be right; we have not tried it.
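For what it's worth, a sketch of the distinction the question is getting at (a hypothetical training step, assuming the model is wrapped in DistributedDataParallel): DDP synchronizes *gradients* for any loss during `backward()`, built-in or custom alike, while an explicit `dist.all_reduce` on the loss tensor synchronizes the loss *value* itself across ranks (e.g. for consistent scaling or logging). The names `criterion` and `custom_loss_fn` below are placeholders, not U2PL code:

```python
import torch
import torch.distributed as dist

def train_step(ddp_model, criterion, custom_loss_fn, x, y, opt):
    # Hypothetical step: criterion could be nn.CrossEntropyLoss,
    # custom_loss_fn a custom term like compute_contra_memobank_loss.
    out = ddp_model(x)
    loss = criterion(out, y) + custom_loss_fn(out)
    if dist.is_initialized():
        # Synchronize the *value* of the loss across ranks.
        dist.all_reduce(loss)
        loss = loss / dist.get_world_size()
    opt.zero_grad()
    loss.backward()  # DDP averages *gradients* across ranks here,
                     # regardless of which losses contributed.
    opt.step()
    return loss.detach()
```

On one GPU the `if` branch is skipped and the step reduces to ordinary single-process training.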