Closed dalonlobo closed 5 years ago
You could plot the training loss with matplotlib to see whether it's converging or not. We didn't add plotting by default because we don't want to have too many dependencies.
But in general, a smaller loss value means a better model.
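A minimal sketch of the plotting suggestion above, assuming you record the loss printed at each iteration into a list yourself (the `losses` values below are synthetic illustration data, not output from any real run):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt

# Hypothetical per-iteration training losses collected during a run.
losses = [-120.5, -135.2, -150.8, -158.1, -161.0, -162.3, -162.9, -163.1]

plt.plot(range(1, len(losses) + 1), losses)
plt.xlabel("iteration")
plt.ylabel("training loss")
plt.title("Training loss vs. iteration")
plt.savefig("loss_curve.png")

# Convergence heuristic: the curve flattens when successive losses
# stop changing by much between iterations.
deltas = [abs(b - a) for a, b in zip(losses, losses[1:])]
print("converged" if deltas[-1] < 1.0 else "still improving")
```

If the curve flattens out like this one does, the model has stopped improving and further epochs are unlikely to help.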
Hi Team,
I'm training the model on a custom dataset. I'm finding it confusing to interpret the various losses displayed during training. For example, is a large negative training loss better, or should I concentrate on the negative log likelihood? To sum up, how will I know that the model is converging? The following image is from my training process.
Your help is greatly appreciated.