sherjilozair / char-rnn-tensorflow

Multi-layer Recurrent Neural Networks (LSTM, RNN) for character-level language models in Python using Tensorflow
MIT License

Perplexity starts increasing after some time #128

Open advanceddeveloper opened 6 years ago

advanceddeveloper commented 6 years ago

I changed the network size from 2 layers to 4 layers and changed the batch size to 1. Now, after about 30 epochs, I see perplexity starting to increase, which is, I think, unexpected.

For the first ~25 epochs it decreases, but then it starts behaving strangely, and the resulting model is worse than it was before epoch 25. Is this a bug?

Here is my TensorBoard plot. As you can see in the bottom graphs, perplexity is fluctuating (bottom right plot). Why is this happening?
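A loss curve that falls and then rises again like this is a common sign of overfitting (a batch size of 1 and a larger 4-layer network both make it more likely). One common mitigation, independent of this repo's code, is to keep the checkpoint with the lowest validation perplexity and stop once it stops improving for a while. The sketch below is hypothetical and assumes you record a per-epoch validation cross-entropy loss; the function names are illustrative, not part of char-rnn-tensorflow:

```python
import math

def perplexity(cross_entropy_loss):
    """Perplexity is the exponential of the mean cross-entropy loss."""
    return math.exp(cross_entropy_loss)

def best_checkpoint(epoch_losses, patience=5):
    """Hypothetical early-stopping helper: return (epoch, perplexity) of the
    best epoch seen so far, stopping once `patience` epochs pass without
    any improvement in validation perplexity."""
    best_epoch, best_ppl = 0, float("inf")
    for epoch, loss in enumerate(epoch_losses):
        ppl = perplexity(loss)
        if ppl < best_ppl:
            best_epoch, best_ppl = epoch, ppl   # save this checkpoint
        elif epoch - best_epoch >= patience:
            break  # validation perplexity stopped improving; stop training
    return best_epoch, best_ppl

# Losses that fall for a while, then rise again -- the pattern reported above:
losses = [2.0, 1.5, 1.2, 1.0, 0.9, 0.95, 1.1, 1.3, 1.4, 1.6]
epoch, ppl = best_checkpoint(losses)
print(epoch, round(ppl, 3))  # best epoch is 4, perplexity exp(0.9) ~ 2.46
```

With this pattern the model you keep is the epoch-25-ish one, so the later degradation no longer matters; increasing the batch size or adding dropout are other common ways to damp the fluctuation itself.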

advanceddeveloper commented 6 years ago

*perplexity