amzn / convolutional-handwriting-gan

ScrabbleGAN: Semi-Supervised Varying Length Handwritten Text Generation (CVPR20)
https://www.amazon.science/publications/scrabblegan-semi-supervised-varying-length-handwritten-text-generation
MIT License

About gradient balancing #23

Open · miranghimire opened this issue 2 years ago

miranghimire commented 2 years ago

There are three backward calls inside the gradient balancing between the generator loss and the OCR loss.

Won't these calls accumulate gradients that then get applied by optimizer.step()? I thought the objective here was simply to compute the gradient-balancing terms and multiply them into the loss. Could you give an overview of what's going on inside the gradient balancing, in case I misunderstood something?
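For context, the paper balances the two losses by rescaling the OCR-loss gradient so its scale matches alpha times that of the adversarial-loss gradient. Here is a minimal sketch of that idea using torch.autograd.grad, which computes the probe gradients without touching the .grad buffers at all; the names (fake, loss_G, loss_OCR, alpha) are placeholders, not the repository's actual code:

```python
import torch

def balanced_ocr_weight(loss_G, loss_OCR, fake, alpha=1.0):
    # Probe gradients of each loss w.r.t. the generated image.
    # torch.autograd.grad does NOT accumulate into .grad buffers,
    # so no zero_grad() is needed between the two calls.
    grad_G = torch.autograd.grad(loss_G, fake, retain_graph=True)[0]
    grad_OCR = torch.autograd.grad(loss_OCR, fake, retain_graph=True)[0]
    # Weight the OCR loss so its gradient std matches alpha times the
    # adversarial gradient's std; detach so the weight acts as a constant.
    return (alpha * grad_G.std() / (grad_OCR.std() + 1e-8)).detach()
```

With a weight like this, a single backward pass on loss_G + weight * loss_OCR would be enough, which is what I expected the code to reduce to.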

sharonFogel commented 2 years ago

I just looked at the code, and I think you're right: there should be a self.netG.zero_grad() between the first and second backward call. The third backward is not meant to accumulate anything; it is performed only so that the graph won't be retained.
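To make the suggestion concrete, here is a minimal sketch of the three-backward pattern with the proposed zero_grad() inserted. All names (netG, optimizer_G, loss_G, loss_OCR, alpha) and the grad_std helper are placeholders for illustration, not the repository's exact code:

```python
import torch

def grad_std(module):
    # Hypothetical helper: std over all gradients currently held
    # in the module's .grad buffers.
    grads = [p.grad.flatten() for p in module.parameters() if p.grad is not None]
    return torch.cat(grads).std()

def backward_G_balanced(netG, optimizer_G, loss_G, loss_OCR, alpha=1.0):
    netG.zero_grad()
    loss_G.backward(retain_graph=True)       # 1st backward: probe the adversarial gradient
    std_G = grad_std(netG)

    netG.zero_grad()                         # <-- the suggested fix: clear the first probe
    loss_OCR.backward(retain_graph=True)     # 2nd backward: probe the OCR gradient
    std_OCR = grad_std(netG)

    weight = alpha * std_G / (std_OCR + 1e-8)

    netG.zero_grad()                         # discard all probe gradients
    (loss_G + weight * loss_OCR).backward()  # 3rd backward: the real update pass; run
                                             # without retain_graph so the graph is freed
    optimizer_G.step()
```

Without the middle zero_grad(), the second probe would read the sum of both losses' gradients, so the balancing ratio would be computed against the wrong quantity.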