Open wakupoo opened 5 years ago
Me too: the accuracy of both the generator and the discriminator is decreasing. It seems the link-prediction result is a little different from the paper's.
I ran into this problem when training other GANs too. My guess: since the generator and discriminator both evolve during training, the GAN first successfully learns features from the real data; when the accuracy starts to drop, it may mean the generator cannot extract any more useful features from the real data, so you need to adjust your model (for instance, try a new loss function) or add more data.
Me too.
Hi guys! I suggest decreasing the learning rate.
I decreased the learning rate to 1e-4 for both the discriminator and the generator, and trained for 100 epochs. The accuracy for G is about 51% and the accuracy for D is about 60%, which is still a little different from the paper. So I am wondering whether anyone could share the exact parameters to get 80%+ accuracy? Thanks a lot.
I set the learning rate to 1e-5 and it worked. It can be set to 1e-6 for greater accuracy, but that also requires more iterations than the default config.
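For reference, the change amounts to lowering both networks' learning rates together. A minimal sketch, assuming the project reads them from a config; the option names here (`lr_gen`, `lr_dis`) are hypothetical and would need to match the repo's actual config file:

```python
# Hypothetical config; the actual option names depend on the repo's config file.
config = {
    "lr_gen": 1e-3,   # assumed default generator learning rate
    "lr_dis": 1e-3,   # assumed default discriminator learning rate
}

def scale_learning_rates(cfg, factor):
    """Return a copy of `cfg` with both learning rates divided by `factor`.
    Smaller steps typically need more training iterations to converge."""
    cfg = dict(cfg)
    cfg["lr_gen"] /= factor
    cfg["lr_dis"] /= factor
    return cfg

tuned = scale_learning_rates(config, 100)  # 1e-3 -> 1e-5 for both networks
```

Lowering the rate by another factor of 10 (to 1e-6) would follow the same pattern, at the cost of a longer run.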
I think you can read the CFGAN paper; it seems to explain the reason for that.
So how many iterations did it take to succeed with the learning rate set to 1e-5? The default number?
I set lr_gen and lr_dis to 1e-5 and batch_size to 1024; at epoch 30 the generator's accuracy reached its maximum of 0.884, a bit higher than the accuracy in the paper.
I met this problem too. The accuracy at the first iteration is about 76%, but after 20 iterations it drops to about 50%. I don't know why. Is this because the data produced by the generator has become similar to the true data? (That's my guess.) If you know the reason, would you please share it with me? Thank you!
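That guess is consistent with GAN theory: once generated samples are indistinguishable from real ones, even the best possible discriminator can do no better than chance, so 50% accuracy is expected at equilibrium. A toy illustration (not the repo's code) with one-dimensional Gaussian "samples":

```python
import random

random.seed(0)

def threshold_discriminator_accuracy(real, fake, threshold):
    """Classify a sample as 'real' if it exceeds `threshold`; return
    the accuracy over the combined real and fake sets."""
    correct = sum(x > threshold for x in real) + sum(x <= threshold for x in fake)
    return correct / (len(real) + len(fake))

real = [random.gauss(1.0, 0.2) for _ in range(5000)]
# Early training: fake samples come from a clearly different distribution.
fake_early = [random.gauss(-1.0, 0.2) for _ in range(5000)]
# Late training: the generator matches the real distribution.
fake_late = [random.gauss(1.0, 0.2) for _ in range(5000)]

acc_early = threshold_discriminator_accuracy(real, fake_early, 0.0)  # near 1.0
acc_late = threshold_discriminator_accuracy(real, fake_late, 0.0)    # near 0.5
```

So a discriminator accuracy falling toward 50% is not necessarily a failure; it can mean the generator has caught up. The concern in this thread is that the *generator's* link-prediction accuracy drops too, which points at the training setup (e.g. learning rate) rather than at equilibrium alone.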