FantDing opened this issue 6 years ago (Open)
Maybe try decreasing the learning rate? Or using a different optimiser?
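For example, something along these lines (a minimal TensorFlow 1.x-style sketch, not this repo's actual training script; the toy loss is only there to make the snippet self-contained):

```python
import tensorflow as tf  # assuming the TF 1.x API used by this repo

# Toy loss so the snippet runs on its own; in the real training script this
# would be the CTPN total loss tensor built elsewhere in the graph.
w = tf.Variable(2.0)
total_loss = tf.square(w)

learning_rate = 1e-5  # e.g. drop an order of magnitude from what you use now

# Option 1: momentum SGD with the reduced learning rate
train_op = tf.train.MomentumOptimizer(learning_rate, momentum=0.9).minimize(total_loss)
# Option 2: switch optimiser entirely, e.g. Adam
# train_op = tf.train.AdamOptimizer(learning_rate).minimize(total_loss)
```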
@cipri-tom Thanks, I will give it a try.
@cipri-tom What's the final value of the total loss, approximately?
Mine goes down to about 0.08 in the final iterations.
But since posting that comment I have also trained on a dataset where the loss had spikes like yours. I couldn't identify the cause, but the training completed fine.
@FantDing recheck the data; there may be an annotation error.
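For example, a quick sanity check along these lines can catch malformed boxes (a hypothetical sketch that assumes one comma-separated "x_min,y_min,x_max,y_max" label file per image and `.jpg` images with matching basenames; adapt the parsing to your actual annotation format):

```python
import os
from PIL import Image

def check_annotations(image_dir, label_dir):
    """Flag boxes that are inverted or fall outside the image bounds."""
    for name in os.listdir(label_dir):
        img_path = os.path.join(image_dir, os.path.splitext(name)[0] + ".jpg")
        width, height = Image.open(img_path).size
        with open(os.path.join(label_dir, name)) as f:
            for line_no, line in enumerate(f, 1):
                x1, y1, x2, y2 = map(float, line.strip().split(",")[:4])
                if not (0 <= x1 < x2 <= width and 0 <= y1 < y2 <= height):
                    print(f"bad box in {name}, line {line_no}: {(x1, y1, x2, y2)}")
```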
@eragonruan Thank you! The high loss may be caused by the model, from which I removed lots of layers. Now I have fixed the VGG16 layers' parameters. It takes about 0.5 s per iteration to train, which is much faster than before. However, the total loss is about 0.2, which is not ideal.
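In case it helps others, one common way to fix the backbone in TF 1.x is to exclude its variables from the optimiser's `var_list`. A rough sketch (it assumes the backbone variables' names contain "vgg_16", which you should verify with `tf.trainable_variables()`; the two toy variables just make it runnable on its own):

```python
import tensorflow as tf  # assuming the TF 1.x API used by this repo

# Toy stand-ins for the backbone and head weights, for illustration only.
w_backbone = tf.get_variable("vgg_16/conv1/weights", shape=[3, 3, 3, 64])
w_head = tf.get_variable("rpn/weights", shape=[64, 2])
total_loss = tf.reduce_sum(tf.square(w_backbone)) + tf.reduce_sum(tf.square(w_head))

# Keep only the non-backbone variables; the optimiser never updates the rest.
trainable = [v for v in tf.trainable_variables() if "vgg_16" not in v.op.name]
train_op = tf.train.AdamOptimizer(1e-5).minimize(total_loss, var_list=trainable)
```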
@FantDing I have the same question. Can you tell me how you fixed the VGG16 layers' parameters? Thanks.
Hi! When I train the model, because of insufficient graphics memory, I changed the model like this.
However, the loss fluctuated violently.
How can I solve that? Thank you.