Closed dante1024 closed 2 years ago
Hello! I have set the batch size to 16. Your source code uses 25 epochs, but I raised it to 500. While training and testing, I found that the loss keeps decreasing after epoch 25, so I assumed the model should perform better beyond that point. However, when I tested at epoch 126, the results still differ from the original paper. Isn't a lower loss supposed to mean a better model?
The learning rate is adjusted with a poly strategy, i.e. it drops after every epoch, so no matter how many epochs you set, the loss will inevitably keep decreasing. A lower loss does not necessarily mean a better model, because it may be overfitting. This paper is not a study of how to train or how to set training hyperparameters, so why spend so much time tuning them?
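For reference, a "poly" schedule is commonly defined as `base_lr * (1 - epoch / max_epochs) ** power`. The sketch below is a minimal illustration under that assumption (the `power=0.9` default and the function name `poly_lr` are illustrative, not taken from this repo's code); it shows why the rate, and hence the loss, keeps shrinking as training runs longer:

```python
def poly_lr(base_lr, epoch, max_epochs, power=0.9):
    """Polynomial decay: the rate shrinks toward 0 as epoch approaches max_epochs."""
    return base_lr * (1.0 - epoch / max_epochs) ** power

# With max_epochs=25 the rate decays to 0 at epoch 25; extending training to
# 500 epochs only stretches the same decay curve, it does not change the trend.
schedule = [poly_lr(0.01, e, 25) for e in range(26)]
```

Because each step uses an ever-smaller learning rate, the training loss almost always continues to fall, which is exactly why a falling loss alone does not indicate better test-set performance.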