Open devraj89 opened 6 years ago
Hi devraj89, were you able to resolve your problems? And on CIFAR-10, did you manage to get the 8.6 score result?
I think this is a bug. Changing `NLLLoss` to `CrossEntropyLoss` and removing the `Softmax` layer solves it.

> [1] For the loss associated with the auxiliary classifier `fc` you are using `NLLLoss`, but the last layer is a `Softmax` layer. Shouldn't it be `LogSoftmax` instead of `Softmax`?
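For context, a minimal sketch of why this matters (the shapes are illustrative, not the repo's tensors): `nn.NLLLoss` expects log-probabilities, so pairing it with a `Softmax` output computes the wrong loss, while either `LogSoftmax` + `NLLLoss` or raw logits + `CrossEntropyLoss` give the correct (and identical) result.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical logits/labels just for illustration; not the repo's tensors.
logits = torch.randn(8, 10)           # raw scores from the final Linear layer
labels = torch.randint(0, 10, (8,))

# Buggy pairing: NLLLoss expects *log*-probabilities, so feeding it
# Softmax output silently computes the wrong quantity.
wrong = nn.NLLLoss()(F.softmax(logits, dim=1), labels)

# Fix 1: keep NLLLoss but make the last layer LogSoftmax.
fix1 = nn.NLLLoss()(F.log_softmax(logits, dim=1), labels)

# Fix 2: drop the softmax layer entirely and use CrossEntropyLoss,
# which fuses LogSoftmax + NLLLoss and takes raw logits.
fix2 = nn.CrossEntropyLoss()(logits, labels)

assert torch.allclose(fix1, fix2)     # the two fixes agree numerically
```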
Hi,

Thanks for publishing the code in PyTorch! I have a few questions, however.

[1] For the loss associated with the auxiliary classifier `fc` you are using `NLLLoss`, but the last layer is a `Softmax` layer. Shouldn't it be `LogSoftmax` instead of `Softmax`?
[2] I am wondering why the noise in line 201 is generated using the `class_one_hot` vector representation. Can't we simply use the noise as generated in line 196? Did you find any improvement with that specific noise generation? Also, instead of randomly generating `label` as in line 197, can't we use the `label` that has already been sampled from the data loader, i.e., line 177?
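For what it's worth, here is a minimal sketch of the class-conditional noise construction being asked about, under the assumption (common across ACGAN implementations, though the repo's lines 196–201 may differ) that the one-hot label is written into the first `n_classes` entries of `z`; the sizes and names are illustrative:

```python
import torch
import torch.nn.functional as F

n_classes, z_dim, batch = 10, 110, 8            # illustrative sizes, not the repo's

# Plain noise, as in line 196.
noise = torch.randn(batch, z_dim)

# Conditional variant (one common construction; the repo's line 201 may
# differ): sample labels, or reuse the ones from the data loader, and write
# their one-hot encoding into the first n_classes slots of z, so that G
# knows which class to generate.
labels = torch.randint(0, n_classes, (batch,))  # random, as in line 197
noise[:, :n_classes] = F.one_hot(labels, n_classes).float()
```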
[3] Also, based on the figure on the main page (the last figure on the right), class information, i.e., `C_class`, is given both to the latent variable `z` and to the discriminator `D` (on `X_real` and `X_fake`) during the training stage. However, this seems to be missing in the code. Can you please clarify why that is? (One common way to do this conditioning is sketched after this message.) Please refer to https://github.com/znxlwm/pytorch-generative-model-collections/blob/master/ACGAN.py
Thank you in advance for the wonderful code.
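Regarding [3], one common way to give `D` the class information shown in the figure is CGAN-style conditioning: broadcast the one-hot label into a spatial map and concatenate it with the image channels before the first conv layer. A hedged sketch (the helper name and shapes are assumptions, not the repo's code, and ACGAN itself may intentionally omit this since its `D` predicts the class instead):

```python
import torch

def condition_discriminator_input(x, labels, n_classes=10):
    """Concatenate a spatially-broadcast one-hot class map onto the image
    channels. A hypothetical helper for illustration, not part of the repo."""
    b, _, h, w = x.shape
    fill = torch.zeros(b, n_classes, h, w, device=x.device)
    fill[torch.arange(b), labels] = 1.0   # sets fill[i, labels[i], :, :] = 1
    # D's first conv must then accept C + n_classes input channels.
    return torch.cat([x, fill], dim=1)

# Applied to both real and fake batches, mirroring the figure:
x_real = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))
d_in = condition_discriminator_input(x_real, labels)   # shape: (8, 13, 32, 32)
```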