eriklindernoren / PyTorch-GAN

PyTorch implementations of Generative Adversarial Networks.

Why is the same optimizer used to optimize both generators in CycleGAN? #108

Open asjad18 opened 4 years ago

asjad18 commented 4 years ago

optimizer_G = torch.optim.Adam(
    itertools.chain(G_AB.parameters(), G_BA.parameters()),
    lr=opt.lr, betas=(opt.b1, opt.b2),
)

I am unable to understand this. Why shouldn't we use a separate optimizer for each generator?

AnimationFan commented 3 years ago

See the paper: the cycle-consistency loss combines the losses of the two generators. The purpose of that loss is to make both generators improve together, so it is correct to optimize both with a single optimizer.
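To make this concrete, here is a minimal sketch of the shared-optimizer pattern. The toy linear layers below are placeholders for the repo's actual ResNet-based generators, and the hyperparameter values are illustrative, not taken from the repo. The key point is that the cycle-consistency term passes a sample through both generators, so a single backward pass produces gradients for both, and one optimizer step updates them together.

```python
import itertools

import torch
import torch.nn as nn

# Toy stand-ins for G_AB and G_BA (the real repo uses CNN generators).
G_AB = nn.Linear(8, 8)
G_BA = nn.Linear(8, 8)

# One optimizer over the chained parameters of both generators.
optimizer_G = torch.optim.Adam(
    itertools.chain(G_AB.parameters(), G_BA.parameters()),
    lr=2e-4, betas=(0.5, 0.999),
)

real_A = torch.randn(4, 8)
real_B = torch.randn(4, 8)
criterion_cycle = nn.L1Loss()

# The cycle-consistency loss runs each sample through BOTH generators:
# A -> G_AB -> fake_B -> G_BA -> recovered_A, and the mirror direction.
recov_A = G_BA(G_AB(real_A))
recov_B = G_AB(G_BA(real_B))
loss_cycle = criterion_cycle(recov_A, real_A) + criterion_cycle(recov_B, real_B)

optimizer_G.zero_grad()
loss_cycle.backward()   # gradients flow into G_AB and G_BA simultaneously
optimizer_G.step()      # one step updates the parameters of both generators
```

Because the loss depends on both generators at once, splitting them across two optimizers would gain nothing: each would still receive gradients from the same joint objective, just with separate Adam state.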