CR-Gjx / LeakGAN

Code for the AAAI 2018 paper "Long Text Generation via Adversarial Training with Leaked Information". Text generation using a GAN and hierarchical reinforcement learning.
https://arxiv.org/abs/1709.08624

Why are there 'tail tokens' after the period? #11

Open shaomai00 opened 6 years ago

shaomai00 commented 6 years ago

I have trained the code in /Image CoCo. During training, I found there are always some 'tail tokens' after the period, as well as some paddings; even at the 451st epoch the problem persists. For example: 'A large bottle of wine sitting on a table . sauerkraut ' or 'A bathroom with a tub , sink and mirror .(pad)(pad)(pad)across '. I'm wondering why this happens? Please help, thank you very much.

CR-Gjx commented 6 years ago

Temperature. I increased the temperature in the softmax function to encourage exploration during training, so many other tokens get explored and their probabilities also increase. I'm working on a fix.
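
To make the mechanism concrete, here is a minimal sketch (not the repo's actual code; the logits and temperature values are illustrative) of how a softmax temperature above 1 flattens the output distribution and inflates the probability of rare "tail" tokens like the ones in the examples above:

```python
import numpy as np

def softmax_with_temperature(logits, tau=1.0):
    """Softmax over logits / tau; a larger tau yields a flatter distribution."""
    scaled = np.asarray(logits, dtype=np.float64) / tau
    scaled -= scaled.max()  # subtract max for numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum()

# Hypothetical logits: '.' is strongly preferred, 'sauerkraut' is a rare token.
logits = {'.': 5.0, 'the': 2.0, 'sauerkraut': -1.0}
for tau in (1.0, 1.5, 2.0):
    probs = softmax_with_temperature(list(logits.values()), tau)
    print(tau, dict(zip(logits, probs.round(3))))
# As tau grows, probability mass shifts from '.' toward the rare token,
# so a sampler is more likely to emit a stray word after the period.
```

At tau = 1.0 the rare token gets roughly 0.2% of the mass, but at tau = 2.0 it gets roughly 4%, which over many sampled sentences is enough to produce occasional tail tokens after the end-of-sentence period.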

shaomai00 commented 6 years ago

But it's strange: I added temperature to SeqGAN the same way you did in LeakGAN, but I didn't see this problem happen...