I changed the network size from 2 layers to 4 layers and changed the batch size to 1. Now, after about 30 epochs, I see perplexity starting to increase, which I think is unexpected.
For the first ~25 epochs it decreases, but then it starts behaving strangely, and the resulting model is worse than the one from before epoch 25. Is this a bug?
Here is my TensorBoard plot. As you can see in the bottom graphs, perplexity is fluctuating (bottom right plot). Why is that happening?