Open shaomai00 opened 6 years ago
Temperature: I changed the temperature in the softmax function to encourage exploration during training. As a result, many other tokens get explored and their probabilities increase. I'm doing some work to fix it.
But it's strange that I added temperature to SeqGAN the same way you did in LeakGAN, and I didn't see this problem happen...
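For reference, a minimal sketch of the temperature-scaled softmax discussed above (the function name and the example logits are illustrative, not from the repo): raising the temperature flattens the distribution, so low-probability tokens are sampled more often.

```python
import numpy as np

def softmax_with_temperature(logits, temperature=1.0):
    """Temperature-scaled softmax.

    temperature > 1 flattens the distribution (more exploration);
    temperature < 1 sharpens it (more exploitation).
    """
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    scaled -= scaled.max()  # subtract max for numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum()

logits = [2.0, 1.0, 0.1]
p_cold = softmax_with_temperature(logits, temperature=0.5)
p_hot = softmax_with_temperature(logits, temperature=2.0)
# The lowest-logit token gains probability mass at higher temperature,
# which is why rare "tail tokens" show up more often in samples.
```

This illustrates why turning up the temperature during training can surface tokens (like stray words after the period) that would otherwise almost never be sampled.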
I have trained the code on /Image CoCo. During training, I found there are always some 'tail tokens' after the period, along with some padding; even at the 451st epoch the problem persists. For example: 'A large bottle of wine sitting on a table . sauerkraut ' or 'A bathroom with a tub , sink and mirror .(pad)(pad)(pad)across '. I'm wondering why this happens? Please help, thank you very much.