MariyaLapaeva opened this issue 2 years ago
Hello, this is kind of a long shot, but it could be because self.loss_D is still referenced. Could you delete this attribute before you call self.compute_G_loss()?
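The suggestion above could be sketched like this — a hypothetical minimal model class, not CUT's actual code, just to show where the attribute would be deleted so the old discriminator graph can be freed:

```python
# Hypothetical sketch of the suggestion above (class and method names are
# simplified stand-ins for CUT's model, not the real implementation).
class Model:
    def __init__(self):
        self.loss_D = None  # stand-in for the stale discriminator-loss tensor

    def compute_G_loss(self):
        return 0.0  # stand-in for the real generator loss computation

    def optimize_parameters(self):
        # Deleting the attribute drops the reference to the old loss (and any
        # computation graph it still holds) before the generator pass.
        if hasattr(self, "loss_D"):
            del self.loss_D
        self.loss_G = self.compute_G_loss()
```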
Oh, great! Thank you very much for the quick reply! It seems to work without initialisation of D_loss in the first round. The results don't look as good as lsgan so far :bowtie:, but I will experiment with the parameters. (lsgan and wgangp comparison images were attached here.)
Hi @MariyaLapaeva, did you find the optimal parameters? I'm also working with medical data and currently trying to use the wgangp version as well!
Hi @MariyaLapaeva and @danivelikova, have you been able to use the wgangp version? I am also working with medical data and lsgan runs into some kind of mode collapse.
Hello,
Thank you for the implementation of the Wasserstein GAN mode and GP loss! I followed the approach proposed here: https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix/issues/439 and it works for me with pix2pix and CycleGAN: I specified --gan_mode wgangp and also call the function cal_gradient_penalty. I followed the same instructions with CUT; however, the modification of the loss function fails for me:
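For reference, the penalty that cal_gradient_penalty computes follows the standard WGAN-GP recipe (gradient norm of the critic at points interpolated between real and fake samples). A self-contained, simplified sketch of that recipe — function name and arguments are my own, not the repo's exact signature — looks roughly like:

```python
import torch

def gradient_penalty(netD, real, fake, device, lambda_gp=10.0):
    # Interpolate between real and fake samples (the "mixed" WGAN-GP scheme).
    alpha = torch.rand(real.size(0), 1, 1, 1, device=device)
    interp = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    d_interp = netD(interp)
    # Gradient of the critic output w.r.t. the interpolated input;
    # create_graph=True so the penalty itself is differentiable.
    grads = torch.autograd.grad(
        outputs=d_interp, inputs=interp,
        grad_outputs=torch.ones_like(d_interp),
        create_graph=True, retain_graph=True)[0]
    grads = grads.view(real.size(0), -1)
    # Penalise deviation of the per-sample gradient norm from 1.
    return lambda_gp * ((grads.norm(2, dim=1) - 1) ** 2).mean()
```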
Even if I specify retain_graph=True here, the error persists.
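For context, retain_graph=True only helps when the same computation graph genuinely needs to be traversed by backward() more than once. A tiny standalone example of that pattern (toy tensors, not CUT code):

```python
import torch

# Toy standalone example: calling backward() twice over the same graph
# is only legal if the first call retains it.
lin = torch.nn.Linear(3, 1)
loss_D = lin(torch.randn(2, 3)).mean()
loss_D.backward(retain_graph=True)  # keep the graph alive for a second pass
loss_D.backward()                   # would raise RuntimeError without retain_graph
```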
Would you have a suggestion on how to solve the problem for CUT? Thank you!