jalola / improved-wgan-pytorch

Improved WGAN in Pytorch
MIT License

Why comment aux_errD_fake loss (aux error of fake data during training D)? #10

Closed gaoming96 closed 5 years ago

gaoming96 commented 5 years ago

Hi Hung, I wonder why you commented out lines 247–248 in congan_train.py? I believe that in the paper, both losses aux_errD_real and aux_errD_fake are needed.

Is there any reason for commenting them out? Thanks in advance.

jalola commented 5 years ago

Hi @gaoming96,

I remember that at the time I was training, I could not get the network to train (probably because of that aux_errD_fake loss, or maybe another bug I don't remember), so I removed it from the loss.

The reason I didn't put aux_errD_fake in the loss is simply that I didn't want D to learn the labelling task from fake output.

For example, during training, the output image from G is still not well formed yet, e.g. it does not yet look like a table, a room, or a bridge (it does not look like a real image), and yet we tell D that this is a table, room, or bridge. So this is wrong information for D to learn from.
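To make the discussion concrete, here is a minimal sketch of the D-loss step in question. The function and tensor names are hypothetical (not the repository's exact code); it only illustrates keeping the auxiliary classification loss on real samples while leaving the fake-sample term commented out, as in lines 247–248 of congan_train.py.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the critic-loss computation being discussed.
# Assumes: disc_* are critic scores, aux_* are per-class log-probabilities,
# labels are the class labels of the real batch.
aux_criterion = nn.NLLLoss()

def d_loss(disc_real, aux_real, disc_fake, aux_fake, labels):
    # WGAN critic term: push D(real) up and D(fake) down
    errD = disc_fake.mean() - disc_real.mean()
    # Auxiliary classification loss on REAL images
    aux_errD_real = aux_criterion(aux_real, labels)
    # aux_errD_fake = aux_criterion(aux_fake, labels)  # commented out:
    # early in training G's samples don't match their labels,
    # so this term would feed D wrong supervision
    return errD + aux_errD_real
```

Un-commenting the aux_errD_fake line (and adding it to the returned sum) recovers the full AC-GAN-style objective from the paper.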

You can try to put it in the loss and let us know the result :) Hung.

gaoming96 commented 5 years ago

Hi Hung, thank you for your reply. I have tried both variants on my own dataset, and the loss without aux_errD_fake when training D gives better visual results. I think your answer makes sense: adding this E[ln P(label | X_fake)] term may harm D if G is not trained well.

jalola commented 5 years ago

I agree: if G is not trained well, it may harm D a lot. If G is trained well, it will probably give a better result.