Closed. HaoFengyuan closed this issue 1 year ago.
Hi @HaoFengyuan,
Sorry for the delay. I see different values in the loss if I use different lr
values. Where are you obtaining the training loss values from?
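For reference, a minimal sketch of reading the training loss directly from the update loop and printing the lr the optimizer is actually using. This assumes a PyTorch-style fine-tuning setup; the model, data, and loss below are placeholders, not code from this repo:

```python
import torch

# Illustrative stand-ins for the pretrained model, data, and loss.
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)
criterion = torch.nn.CrossEntropyLoss()
data = [(torch.randn(4, 10), torch.randint(0, 2, (4,))) for _ in range(3)]

for step, (inputs, targets) in enumerate(data):
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()

    # Log the loss of the batch just optimized, and the lr taken from
    # optimizer.param_groups, which is what Adam actually applies.
    print(f"step {step}: loss={loss.item():.6f} "
          f"lr={optimizer.param_groups[0]['lr']}")
```

If the loss reported elsewhere (e.g. from a separate evaluation script) differs from what this kind of per-step logging shows, that would help narrow down where the identical values are coming from.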
Closing due to inactivity. Feel free to reopen if you see fit.
When fine-tuning the pretrained model with Adam, I found that the loss is the same at different learning rates in each epoch, and I don't know why. For example, the loss values are identical when lr = 1e-5 and even when lr = 0.
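One way to sanity-check this symptom is to snapshot the parameters before and after an update step: with lr > 0 they should change, and with lr = 0 they should not. If they never change for any lr, the optimizer is not updating the weights it was given (e.g. frozen parameters or the wrong parameter set), which would make the loss identical at every learning rate. A minimal sketch, assuming a PyTorch setup; the model and data here are illustrative only:

```python
import torch

def params_changed(model: torch.nn.Module, before: dict) -> bool:
    """Return True if any parameter differs from the 'before' snapshot."""
    return any(
        not torch.equal(p, before[name])
        for name, p in model.named_parameters()
    )

model = torch.nn.Linear(10, 2)            # stand-in for the pretrained model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)

before = {n: p.detach().clone() for n, p in model.named_parameters()}

# One illustrative update step on random data.
loss = model(torch.randn(4, 10)).pow(2).mean()
optimizer.zero_grad()
loss.backward()
optimizer.step()

# Expected: True with lr > 0, False with lr = 0.
# False for every lr would mean the optimizer never touches these weights.
print(params_changed(model, before))
```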