Once88 opened this issue 5 years ago
When I train a model, the loss converges quickly after a few steps and barely changes afterwards, but the accuracy (mAP) keeps getting better and better. Can someone help explain this? Thanks a lot.
Maybe once the loss is small, many different weight configurations reach almost the same loss value, but those different weights can still produce different mAP.
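To illustrate the idea (this toy sketch is not from the thread, and the exact prediction values are made up): cross-entropy loss depends on the absolute probability assigned to each sample, while average precision depends only on how the scores are ranked, so two prediction sets can have nearly identical loss yet different AP.

```python
# Toy example: nearly equal log loss, clearly different average precision.
# Values below are hand-picked for illustration only.
from sklearn.metrics import average_precision_score, log_loss

y_true = [1, 1, 0, 0]

# Model A: low-confidence scores, but the ranking is perfect
# (both positives scored above both negatives).
preds_a = [0.52, 0.52, 0.48, 0.48]

# Model B: more confident on some samples, but one negative (0.6)
# is ranked above one positive (0.28), hurting the ranking.
preds_b = [0.80, 0.28, 0.60, 0.20]

print("loss A:", log_loss(y_true, preds_a))   # ~0.654
print("loss B:", log_loss(y_true, preds_b))   # ~0.659 (almost the same)
print("AP  A :", average_precision_score(y_true, preds_a))  # 1.00
print("AP  B :", average_precision_score(y_true, preds_b))  # ~0.83
```

So as training continues, the average loss can stay roughly flat while the model keeps improving the ordering (and localization) of its predictions, which is what mAP actually measures.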