Pascalson / Conditional-Seq-GANs

GANs for Conditional Sequence Generation, in TensorFlow. Includes the code of the paper "Improving Conditional Sequence Generative Adversarial Networks by Stepwise Evaluation", IEEE/ACM TASLP, 2019.
MIT License

The StepGAN train loss #4

Open TedYeh opened 4 years ago

TedYeh commented 4 years ago

Hi, I used my own dataset to train the model. The final MLE-training result is:

global step 100000; learning rate 0.2602; step-time 0.06; perplexity 1.60; loss 0.47
  eval: bucket 0 perplexity 1.74; loss 0.55
  eval: bucket 1 perplexity 1.89; loss 0.64
  eval: bucket 2 perplexity 1.77; loss 0.57
  eval: bucket 3 perplexity 2.04; loss 0.71

and then I trained StepGAN:

global step 1600; learning rate 0.00006000; D lr 0.00006000; step-time 0.95;
perp -1.00890
loss 6.90309
D-loss 1.39027
reward/D_fake_value [0.24418635 0.11148128 0.11218258 0.11246377 0.10557028 0.09683337
 0.08592263 0.07430949 0.10731699 0.08943568 0.10690096 0.08320196
 0.06315169 0.04705423 0.03540983 0.10991684 0.08354869 0.06181738
 0.04607107 0.03532142]

The MLE loss is 0.47, but the StepGAN loss is 6.90309. Why are the losses so different?
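(One observation that may help: the MLE numbers in the first log are self-consistent, since the reported perplexity is just exp(cross-entropy loss). The StepGAN generator loss, however, is presumably a reward-weighted objective, with per-step log-probabilities scaled by discriminator values like the `reward/D_fake_value` array above, so its scale depends on the reward magnitudes and is not directly comparable to plain cross-entropy. A minimal sketch; the `log_probs` values below are hypothetical, not from the log:)

```python
import math

# Per-token cross-entropy from the MLE log; perplexity = exp(loss).
mle_loss = 0.47
print(round(math.exp(mle_loss), 2))  # -> 1.6, matching the reported perplexity

# Sketch of a reward-weighted (policy-gradient-style) generator loss,
# assuming StepGAN scales per-step negative log-probabilities by
# stepwise discriminator values. Rewards taken from the log above;
# log_probs are hypothetical placeholders.
rewards = [0.24418635, 0.11148128, 0.11218258]
log_probs = [2.3, 2.1, 2.5]  # hypothetical per-step -log p(token)
stepgan_loss = sum(r * lp for r, lp in zip(rewards, log_probs))
print(stepgan_loss)
```

Because the two objectives measure different things (likelihood vs. reward-weighted likelihood), their absolute values are expected to differ; trends within each phase matter more than cross-phase comparison.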

RAJBAPU commented 4 years ago

Can you please tell me the process to train on my own dataset?