JunlinHan / DCLGAN

Code for Dual Contrastive Learning for Unsupervised Image-to-Image Translation, NTIRE, CVPRW 2021, oral.

Using self.loss_G to backpropagate both G_A and G_B #6

Open kk2487 opened 2 years ago

kk2487 commented 2 years ago

Hello, in dcl_model.py,

why can you use self.loss_G to backpropagate through both G_A and G_B? Is there anything special needed to handle this?

JunlinHan commented 2 years ago

Hi kk2487, Thanks for your questions.

The parameters of G_A and G_B are chained together in the optimizer, and loss_G includes the losses of both G_A and G_B. Thus they can be backpropagated together and the parameters updated in one go.

See line 103 (optimizer) and lines 202-233 (G_loss) for details.
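For reference, the pattern looks roughly like this (a minimal sketch, not the exact DCLGAN code; netG_A, netG_B, the toy inputs, and the loss terms are placeholders):

```python
import itertools
import torch

# Two generators; their parameters are chained into a single optimizer,
# mirroring the pattern described above.
netG_A = torch.nn.Linear(8, 8)
netG_B = torch.nn.Linear(8, 8)
optimizer_G = torch.optim.Adam(
    itertools.chain(netG_A.parameters(), netG_B.parameters()), lr=2e-4
)

x_A, x_B = torch.randn(4, 8), torch.randn(4, 8)

# Each loss term depends on one generator; the combined loss covers both.
loss_G_A = netG_A(x_A).pow(2).mean()
loss_G_B = netG_B(x_B).pow(2).mean()
loss_G = loss_G_A + loss_G_B

optimizer_G.zero_grad()
loss_G.backward()   # gradients flow to both generators in one pass
optimizer_G.step()  # one step updates the parameters of both
```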

kk2487 commented 2 years ago

Thanks for your response.

I have another question.

loss_G is the sum of loss_G_A and loss_G_B. In the original design, the parameters of G_A and G_B are chained together and backpropagated with the same loss (loss_G).

Should G_A backpropagate with loss_G_A and G_B backpropagate with loss_G_B, updating the parameters separately?

JunlinHan commented 2 years ago

> Should G_A backpropagate with loss_G_A and G_B backpropagate with loss_G_B, updating the parameters separately?

Yes, conceptually the parameters should be updated separately. But the implementation here is equivalent: PyTorch's autograd automatically matches each loss term with the parameters it depends on, so loss_G_B contributes zero gradient to G_A's parameters, and vice versa.
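A quick way to convince yourself (a toy check, not DCLGAN code): backpropagating the summed loss produces the same gradients as backpropagating each loss separately, since each term depends only on its own generator's parameters.

```python
import torch

netG_A = torch.nn.Linear(4, 4)
netG_B = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)

# Separate backward passes, one per loss term.
loss_A = netG_A(x).sum()
loss_B = netG_B(x).sum()
loss_A.backward()
loss_B.backward()
grads_separate = [p.grad.clone() for p in netG_A.parameters()]

# Combined backward pass on the summed loss.
netG_A.zero_grad(); netG_B.zero_grad()
loss = netG_A(x).sum() + netG_B(x).sum()
loss.backward()
grads_combined = [p.grad for p in netG_A.parameters()]

# G_A's gradients are identical either way, because loss_B contributes
# nothing to them; the same holds symmetrically for G_B.
assert all(torch.allclose(a, b) for a, b in zip(grads_separate, grads_combined))
```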