leondgarse / Keras_insightface

Insightface Keras implementation
MIT License
230 stars · 56 forks

Loss and accuracy #104

Closed Abdelfatah1998 closed 1 year ago

Abdelfatah1998 commented 1 year ago

Hi,

I am training a model, and the loss goes down and accuracy goes up as training proceeds; however, once I reach an accuracy of about 0.99, the loss goes up quickly and then starts decreasing all over again. For example, the model below reached an accuracy of 0.99 and then suddenly dropped back to an accuracy of 0.8. Is there a reason for this, or somewhere I can edit my training script?


leondgarse commented 1 year ago

It's the learning-rate strategy: at epoch 18 it restarted from half of the initial value. This is controlled by `lr_decay_steps` in `train.Train`, and explained in detail in the #learning-rate section. You can turn it off by setting `lr_decay_steps={your_total_epochs - 1}`. You can also see this phenomenon in most of the training scripts in #15, where accuracy and loss restart at around epoch 18. But usually the loss shouldn't go that high. What was your loss in the early epochs, like epoch 3?
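To illustrate the behavior described above, here is a minimal standalone sketch (not the repo's actual implementation) of a cosine learning-rate schedule with warm restarts, where each new cycle starts from half of the previous cycle's peak. The function name and parameters (`cosine_restart_lr`, `first_decay_epochs`, `m_mul`) are assumptions chosen for this example:

```python
import math

def cosine_restart_lr(epoch, lr_base=0.1, first_decay_epochs=17,
                      m_mul=0.5, alpha=0.0):
    """Sketch: cosine decay with warm restarts.

    Each cycle is `first_decay_epochs` long; at every restart the peak
    learning rate is multiplied by `m_mul` (0.5 -> restart from half the
    previous peak, as described in the answer above).
    """
    start, lr_max = 0, lr_base
    # Walk forward cycle by cycle until we find the one containing `epoch`.
    while epoch >= start + first_decay_epochs:
        start += first_decay_epochs
        lr_max *= m_mul  # each restart begins from half the previous peak
    progress = (epoch - start) / first_decay_epochs
    # Standard cosine decay within the current cycle.
    return alpha + (lr_max - alpha) * 0.5 * (1 + math.cos(math.pi * progress))

# Epoch 0 starts at lr_base = 0.1; the LR decays over the cycle, then at
# epoch 17 the schedule restarts from 0.05 -- which is what produces the
# sudden jump in loss (and drop in accuracy) seen in the training curves.
```

This jump in learning rate is what temporarily destroys accuracy before the model re-converges, typically to a better optimum than a single monotone decay would reach.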

Abdelfatah1998 commented 1 year ago

Thanks for the detailed explanation. The loss function I am using is ArcFace, with:

```python
optimizer = keras.optimizers.SGD(learning_rate=0.1, momentum=0.9)
sch = [
    {"loss": losses.ArcfaceLoss(scale=32), "epoch": 1, "optimizer": optimizer},
    {"loss": losses.ArcfaceLoss(scale=64), "epoch": 50},
]
```

leondgarse commented 1 year ago

I think the loss is normal then: with SGD + an L2 regularizer, the printed loss value includes the L2 loss.

Abdelfatah1998 commented 1 year ago

Noted, thanks for your description and support, much appreciated.