BUTSpeechFIT / EEND


about train.py #4

Closed HaoFengyuan closed 1 year ago

HaoFengyuan commented 2 years ago

When fine-tuning the pretrained model with Adam, I found that the training loss is the same at different learning rates in every epoch, and I don't know why. For example, with lr = 1e-5:

[ INFO : 2022-08-30 21:01:58,956 ] - Epoch:   1, LR: 0.0000100, Training Loss: 0.36569
[ INFO : 2022-08-30 21:02:08,486 ] - Epoch:   2, LR: 0.0000100, Training Loss: 0.37642
[ INFO : 2022-08-30 21:02:18,587 ] - Epoch:   3, LR: 0.0000100, Training Loss: 0.37213

and even with lr = 0:

[ INFO : 2022-08-30 20:53:02,874 ] - Epoch:   1, LR: 0.0000000, Training Loss: 0.36569  
[ INFO : 2022-08-30 20:53:12,830 ] - Epoch:   2, LR: 0.0000000, Training Loss: 0.37642  
[ INFO : 2022-08-30 20:53:23,750 ] - Epoch:   3, LR: 0.0000000, Training Loss: 0.37213
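
As a debugging aid, here is a minimal sanity check I would expect to distinguish the two cases. This is only a sketch, using a toy `torch.nn.Linear` and `BCEWithLogitsLoss` as hypothetical stand-ins for the actual EEND model and loss (it is not the repository's `train.py`); it checks whether Adam actually moves the parameters at a given learning rate:

```python
# Sketch: verify that the optimizer updates the weights for lr > 0 and leaves
# them untouched for lr = 0 (the toy model/loss below are assumptions, not EEND).
import copy
import torch

torch.manual_seed(0)
model = torch.nn.Linear(10, 2)               # hypothetical stand-in for the EEND model
criterion = torch.nn.BCEWithLogitsLoss()
x = torch.randn(32, 10)
y = torch.randint(0, 2, (32, 2)).float()

for lr in (0.0, 1e-5, 1e-3):
    m = copy.deepcopy(model)                 # same initial weights for every run
    opt = torch.optim.Adam(m.parameters(), lr=lr)
    before = [p.detach().clone() for p in m.parameters()]

    loss = criterion(m(x), y)                # loss is computed before the update
    opt.zero_grad()
    loss.backward()
    opt.step()

    changed = any(not torch.equal(b, p.detach())
                  for b, p in zip(before, m.parameters()))
    print(f"lr={lr:g}  loss={loss.item():.5f}  params_changed={changed}")
```

In this toy run the printed loss is identical for every lr because it is computed before the first optimizer step; only `params_changed` shows whether the weights moved (False for lr = 0, True otherwise). If a similar check inside the real training loop shows the parameters changing while the reported loss stays identical across learning rates, the discrepancy would more likely come from where the logged loss is accumulated than from the optimizer itself.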
fnlandini commented 2 years ago

Hi @HaoFengyuan, sorry for the delay. I see different loss values when I use different learning rates. Where are you obtaining the training loss values from?

fnlandini commented 1 year ago

Closing due to inactivity. Feel free to reopen if you see fit.