Bartzi / stn-ocr

Code for the paper STN-OCR: A single Neural Network for Text Detection and Text Recognition
https://arxiv.org/abs/1707.08831
GNU General Public License v3.0

loss and accuracy were not normal when trained for 30 epochs #3

Closed yangxiuwu closed 7 years ago

yangxiuwu commented 7 years ago

plot

I used the generated/centered dataset, and I am confused about why there is no validation information in the log file.

Bartzi commented 7 years ago

hmm, in order to help you I will need the log output that was generated during your training, because the plot depends directly on the log output, as we saw in your last issue. :wink:

yangxiuwu commented 7 years ago

log.gz

Thanks, here is the log.

Bartzi commented 7 years ago

Well, that is interesting. If I use the plot_log.py script in the folder mxnet/utils on your log file, without any changes, I get the following output:

plot

I did the following: `python plot_log.py <path to log file> -d plot.png`. So the validation information is clearly in the log file; the first validation occurrence is in line 406 of your file.

Did you do the same as I did?
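
For illustration, extracting the validation entries from an MXNet training log boils down to a small regex scan. A minimal sketch (the `Epoch[N] ... Validation-accuracy=...` line format is MXNet's usual logging output; this is an assumption about the idea, not the actual plot_log.py code):

```python
import re

# Assumed MXNet log format, e.g. "INFO:root:Epoch[3] Validation-accuracy=0.874200";
# a sketch of the extraction idea, not the actual plot_log.py implementation.
VAL_RE = re.compile(r"Epoch\[(\d+)\].*Validation-accuracy=([0-9.]+)")

def validation_accuracies(log_path):
    """Return a list of (epoch, accuracy) pairs found in the log file."""
    pairs = []
    with open(log_path) as log_file:
        for line in log_file:
            match = VAL_RE.search(line)
            if match:
                pairs.append((int(match.group(1)), float(match.group(2))))
    return pairs

if __name__ == "__main__":
    for epoch, accuracy in validation_accuracies("training.log"):
        print("epoch {}: validation accuracy {:.4f}".format(epoch, accuracy))
```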

yangxiuwu commented 7 years ago

I'm very sorry for my carelessness; I didn't check the log carefully. When I debugged the script with the newest version of MXNet there was an error in plot_log.py, so I commented out the line `axe.plot(x_test, [test_iterations[iteration][metric] for iteration in x_test], 'g.-', label='test')`. Now it plots the validation curve. So the only remaining question is why the accuracy declined after some epochs.
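
Rather than deleting the test curve entirely, a guard might keep it whenever test data is actually present. A minimal sketch, assuming the error came from epochs without test entries (`axe`, `x_test`, `test_iterations`, and `metric` are the names from the original line):

```python
# Sketch: draw the test curve only when every requested epoch actually has
# the metric; variable names follow the original plot_log.py line.
if x_test and all(metric in test_iterations[iteration] for iteration in x_test):
    axe.plot(
        x_test,
        [test_iterations[iteration][metric] for iteration in x_test],
        'g.-',
        label='test',
    )
```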

Bartzi commented 7 years ago

Alright, well I'm not 100% sure why this happens, but my guess is that the learning rate was too high; something like this usually happens because of that... so it could be that the network jumped out of the local minimum/saddle point it was currently in and was not able to find a way back...

yangxiuwu commented 7 years ago

Your guess was right. I adjusted the learning rate from 1e-5 to 1e-6, and then the accuracy trend became normal. Thanks for your help.
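
For anyone landing here with the same problem: with MXNet's Module API the learning rate is passed through `optimizer_params`. A minimal, self-contained sketch with toy data (the network and iterators are placeholders, not the repository's training code; only the `optimizer_params` line is the actual fix):

```python
import mxnet as mx
import numpy as np

# Toy network and data, stand-ins for the real training setup.
data = mx.sym.Variable('data')
net = mx.sym.FullyConnected(data, num_hidden=10)
net = mx.sym.SoftmaxOutput(net, name='softmax')

train_iter = mx.io.NDArrayIter(np.random.rand(100, 20).astype(np.float32),
                               np.random.randint(0, 10, 100), batch_size=10)
val_iter = mx.io.NDArrayIter(np.random.rand(20, 20).astype(np.float32),
                             np.random.randint(0, 10, 20), batch_size=10)

model = mx.mod.Module(symbol=net, context=mx.cpu())
model.fit(
    train_iter,
    eval_data=val_iter,
    optimizer='adam',
    optimizer_params={'learning_rate': 1e-6},  # lowered from 1e-5
    num_epoch=2,
)
```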