clvrai / ACGAN-PyTorch

MIT License
264 stars 62 forks

error in the loss #8

Open devraj89 opened 6 years ago

devraj89 commented 6 years ago

Hi

Thanks for publishing the code in PyTorch! I have a few questions, however.

[1] For the loss associated with the auxiliary classifier fc you are using NLLLoss, but the last layer is a Softmax layer. Shouldn't it be LogSoftmax instead of Softmax?
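To illustrate the mismatch in [1]: PyTorch's NLLLoss expects log-probabilities, so pairing it with a plain Softmax silently computes the wrong quantity, while LogSoftmax + NLLLoss is equivalent to CrossEntropyLoss on raw logits. A minimal sketch (shapes here are arbitrary, not taken from the repo):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 10)            # batch of 4, 10 classes
targets = torch.randint(0, 10, (4,))

# Correct pairing: NLLLoss expects log-probabilities.
correct = F.nll_loss(F.log_softmax(logits, dim=1), targets)

# Buggy pairing: feeding Softmax probabilities skips the log entirely,
# so the "loss" is just a negated probability, not a cross entropy.
buggy = F.nll_loss(F.softmax(logits, dim=1), targets)

# CrossEntropyLoss fuses LogSoftmax + NLLLoss and matches the correct pairing.
fused = F.cross_entropy(logits, targets)

print(torch.allclose(correct, fused))  # True
print(torch.allclose(correct, buggy))  # False
```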

[2] I am wondering why the noise on line 201 is generated using the class one-hot vector representation. Can't we simply use the noise as generated on line 196? Did you find any improvement with that specific noise construction?

Also, instead of randomly generating labels as on line 197, can't we use the labels that were sampled from the data loader on line 177?
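For context on what the question in [2] refers to: a common ACGAN convention is to overwrite the first num_classes entries of the Gaussian noise with the one-hot class code, so the generator is explicitly conditioned on the sampled class. A sketch under assumed shapes (nz and the batch size here are illustrative, not the repo's values):

```python
import torch

nz, num_classes, batch = 110, 10, 4

labels = torch.randint(0, num_classes, (batch,))  # randomly sampled fake labels
noise = torch.randn(batch, nz)                    # plain Gaussian noise

# Embed the class code into the noise: one-hot-encode the labels and
# overwrite the leading num_classes entries of each noise vector.
one_hot = torch.zeros(batch, num_classes)
one_hot.scatter_(1, labels.view(-1, 1), 1.0)
noise[:, :num_classes] = one_hot
```

As for reusing the loader's labels versus sampling fresh random ones: both are valid; random labels simply decouple the fake-class distribution from the current real batch.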

[3] Based on the figure on the main page (the last figure on the right), the class information C_class is shown being fed both into the latent variable z and into the discriminator D (on X_real and X_fake) during training. In the code, however, this seems to be missing. Can you please clarify why?
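For reference, feeding the class to D as the figure suggests would be a conditional-GAN-style input, which is different from what the AC-GAN paper itself does (there, D only sees the image and recovers the class through its auxiliary head). A hypothetical sketch of that cGAN-style conditioning, tiling the one-hot label as extra image channels:

```python
import torch

# Illustrative shapes only; not taken from the repo.
batch, num_classes, H, W = 4, 10, 32, 32
images = torch.randn(batch, 3, H, W)
labels = torch.randint(0, num_classes, (batch,))

# One-hot-encode the labels, then broadcast each code over the spatial
# grid so it can be concatenated to the image channels.
one_hot = torch.zeros(batch, num_classes)
one_hot.scatter_(1, labels.view(-1, 1), 1.0)
label_maps = one_hot.view(batch, num_classes, 1, 1).expand(-1, -1, H, W)

d_input = torch.cat([images, label_maps], dim=1)  # shape (4, 13, 32, 32)
```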

Please refer to this https://github.com/znxlwm/pytorch-generative-model-collections/blob/master/ACGAN.py

Thank you in advance for the wonderful code.

Arrcil commented 5 years ago

Hi devraj89, did you manage to resolve these problems? And on CIFAR-10, were you able to reproduce the 8.6 score?

zhangyixing0404 commented 5 years ago

I think this is a bug. Changing NLLLoss to CrossEntropyLoss and removing the Softmax layer fixes it.

> [1] For the loss associated with the auxiliary classifier fc you are using NLLLoss, but the last layer is a Softmax layer. Shouldn't it be LogSoftmax instead of Softmax?
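The suggested fix can be sketched as follows: the auxiliary head returns raw logits (no Softmax module), and nn.CrossEntropyLoss applies LogSoftmax + NLLLoss internally. The layer names and sizes here are illustrative, not the repo's:

```python
import torch
import torch.nn as nn

num_classes, feat = 10, 64
aux_head = nn.Linear(feat, num_classes)  # no nn.Softmax() after this layer
aux_criterion = nn.CrossEntropyLoss()    # expects raw logits

features = torch.randn(4, feat)
labels = torch.randint(0, num_classes, (4,))

loss = aux_criterion(aux_head(features), labels)
loss.backward()  # gradients flow through the logits into the head's weights
```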