flashlight / wav2letter

Facebook AI Research's Automatic Speech Recognition Toolkit
https://github.com/facebookresearch/wav2letter/wiki

Valid Loss increasing while valid WER decreasing #879

Closed. natspan closed this issue 3 years ago.

natspan commented 3 years ago

I am training a model and noticed that, after a while, the validation loss starts increasing while the WER keeps decreasing. Does that mean that my model is overfitting?

tlikhomanenko commented 3 years ago

Hey! We have observed this behaviour all the time. At the beginning and for most of training they both go down, and at some point the valid loss starts to increase while the WER continues to decrease. This is not related to overfitting; it is related to the metrics we optimize. Loss and WER are correlated, but they are different metrics, and the minimum of the loss doesn't correspond to the minimum of the WER.
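
For illustration, here is a minimal sketch (plain Python, not wav2letter code) of how WER is typically computed: it is a word-level edit distance on the decoded transcript, while the training loss (e.g. CTC/ASG) is computed on frame-level scores, so the two metrics can move in different directions.

```python
# Minimal WER sketch: word error rate is the Levenshtein distance over words
# (substitutions + insertions + deletions) divided by the number of reference
# words. The training loss is computed on frame-level log-probabilities, so a
# lower loss does not have to mean a lower WER.

def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

print(wer("the cat sat on the mat", "the cat sat on a mat"))  # 1/6 ≈ 0.167
```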

natspan commented 3 years ago

@tlikhomanenko thank you for your response. I am wondering: when this happens, should I stop training? Or should I continue because the WER is decreasing, even though the valid loss is increasing and the training loss continues to decrease?

tlikhomanenko commented 3 years ago

Continue, since we care about WER and not the loss itself. The loss plots can help detect if something is really going wrong, but otherwise look at the WER: as long as the train and valid WER keep going down, you can continue training.
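
As a concrete example of that policy, here is a minimal sketch (plain Python, not wav2letter code) of a stopping criterion driven by valid WER rather than valid loss; the `patience` value is an assumption and would need tuning for a given setup.

```python
# Stop (or keep the best checkpoint) based on valid WER, treating the valid
# loss only as a sanity check for training going badly wrong.

def should_stop(valid_wer_history, patience=20):
    """Stop once valid WER hasn't improved for `patience` evaluations."""
    if len(valid_wer_history) <= patience:
        return False
    best_before = min(valid_wer_history[:-patience])
    recent_best = min(valid_wer_history[-patience:])
    # No improvement over the last `patience` evaluations -> stop.
    return recent_best >= best_before

# Example: WER keeps improving, so training continues even if the loss rises.
history = [45.0, 32.1, 25.4, 21.0, 19.8, 19.2, 18.9]
print(should_stop(history, patience=3))  # False -> keep training
```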