Bartzi / see

Code for the AAAI 2018 publication "SEE: Towards Semi-Supervised End-to-End Scene Text Recognition"
GNU General Public License v3.0

Cannot get the same accuracy as described in the paper #53

Closed k0286 closed 5 years ago

k0286 commented 5 years ago

Hello @Bartzi ,

I'm trying to redo the experiments on the SVHN dataset downloaded from your site.

I've trained the model 10 times using svhn/generated/centered as the dataset, but I cannot reach good accuracy.

All of the models reach high training accuracy (~90%), but the validation accuracy stays stuck at 50–60%.

Did I miss something?

The following shows how I train the model (a combined sketch of the two stages follows the list):

  1. Warm up the localization net:
    python chainer/train_svhn.py datasets/svhn/curriculum.json log/ -g 0 --char-map datasets/svhn/svhn_char_map.json -lr 1e-4 -si 2000 --blank-label 0 -b 128
  2. After 10 epochs, the training accuracy reaches about 40–50%.
  3. Re-initialize the parameters of the recognition network and resume from the stage-1 snapshot:
    python chainer/train_svhn.py datasets/svhn/curriculum.json log/ -g 0 --char-map datasets/svhn/svhn_char_map.json --blank-label 0 -b 128 -e 50 -r ./log/2018-10-19T01\:21\:24.487865_training/model_6000.npz --load-localization -lr 1e-4
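For reference, here is the same two-stage workflow as a minimal Python sketch (the stage-1 snapshot path is a placeholder and has to point at the training directory actually created in step 1):

import subprocess

# Shared arguments taken from the two commands above.
COMMON = [
    "python", "chainer/train_svhn.py",
    "datasets/svhn/curriculum.json", "log/",
    "-g", "0",
    "--char-map", "datasets/svhn/svhn_char_map.json",
    "--blank-label", "0",
    "-b", "128",
    "-lr", "1e-4",
]

# Stage 1: warm up the localization net, snapshotting every 2000 iterations.
subprocess.run(COMMON + ["-si", "2000"], check=True)

# Stage 2: resume from a stage-1 snapshot (placeholder path, adjust to the
# training directory created above); --load-localization loads only the
# localization weights, so the recognition network starts from scratch.
snapshot = "log/<stage-1 training dir>/model_6000.npz"
subprocess.run(COMMON + ["-e", "50", "-r", snapshot, "--load-localization"], check=True)
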
Bartzi commented 5 years ago

Hmm, interesting. As far as I can see, your training steps look reasonable. Did you have a look at the plotted state of the network? Check the bboxes directory in the log_dir; it shows the current predictions of the network on a validation sample, which could help debug the problem.

Other than that: what does your curriculum.json look like?

k0286 commented 5 years ago

Hi @Bartzi, sorry for the late reply! I've checked the results in the bboxes directory, and the localization net seems to work normally. The curriculum.json is

[
    {
        "train": "/workspace/see/datasets/svhn/generated/centered/train.csv",
        "validation": "/workspace/see/datasets/svhn/generated/centered/valid.csv"
    }
]
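As a quick sanity check (a minimal sketch, assuming the script is run from a directory where the paths in curriculum.json resolve), this verifies that every referenced CSV exists and prints how many lines each contains:

import json
import os

# Sanity check: make sure every CSV referenced by the curriculum exists and
# print its line count (one line per sample, plus a possible header).
with open("datasets/svhn/curriculum.json") as f:
    curriculum = json.load(f)

for stage in curriculum:
    for split, csv_path in stage.items():
        if not os.path.exists(csv_path):
            print(f"{split}: MISSING {csv_path}")
            continue
        with open(csv_path) as csv_file:
            line_count = sum(1 for _ in csv_file)
        print(f"{split}: {csv_path} ({line_count} lines)")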

The output from one of the rerun experiments is shown below:

epoch       iteration   main/loss   main/accuracy  lr          fast_validation/main/loss  fast_validation/main/accuracy  validation/main/loss  validation/main/accuracy
0           100         1.25307     0.357578       3.08566e-05                                                                                                            
0           200         1.24456     0.368125       4.25853e-05                                                                                                            
0           300         1.20153     0.385469       5.09208e-05                                                                                                            
1           400         1.17223     0.400391       5.74294e-05                                                                                                            
1           500         1.11951     0.422812       6.27392e-05                                                                                                            
1           600         1.10798     0.433789       6.71828e-05                                                                                                            
1           700         1.06278     0.453633       7.0964e-05                                                                                                            
2           800         1.03463     0.468633       7.42193e-05                                                                                                            
2           900         0.985931    0.491172       7.70463e-05                                                                                                            
2           1000        0.957746    0.505977       7.95176e-05  1.53418                    0.319311                                                                       
2           1100        0.933869    0.517109       8.16892e-05                                                                                                            
3           1200        0.904873    0.533711       8.36054e-05                                                                                                            
3           1300        0.858611    0.552344       8.53021e-05                                                                                                            
3           1400        0.865824    0.551641       8.68087e-05                                                                                                            
3           1500        0.829821    0.568203       8.81497e-05                                                                                                            
4           1600        0.807737    0.5775         8.93457e-05                                                                                                            
4           1700        0.781786    0.592383       9.04141e-05                                                                                                            
4           1800        0.759315    0.602109       9.13701e-05                                                                                                            
4           1900        0.761582    0.604453       9.22265e-05                                                                                                            
5           2000        0.749461    0.609375       9.29946e-05  1.44547                    0.364784                                                                       
5           2100        0.724327    0.623398       9.36842e-05                                                
5           2200        0.686837    0.638203       9.43037e-05                                                                                                            
5           2300        0.685871    0.639766       9.48608e-05                                                                                                            
6           2400        0.674767    0.64957        9.5362e-05                                                                                                            
6           2500        0.639222    0.662266       9.58132e-05                                                                                                            
6           2600        0.656936    0.654648       9.62197e-05                                                                                                            
6           2700        0.629457    0.668945       9.6586e-05                                                                                                            
7           2800        0.613539    0.674102       9.69162e-05                                                                                                            
7           2900        0.5958      0.685625       9.7214e-05                                                                                                            
7           3000        0.577403    0.69668        9.74827e-05  1.20679                    0.461639                                                                       
7           3100        0.578619    0.691172       9.77252e-05                                                                                                            
8           3200        0.557145    0.704883       9.7944e-05                                                                                                            
8           3300        0.547113    0.709609       9.81416e-05                                                                                                            
8           3400        0.5424      0.714766       9.83201e-05                                                                                                            
8           3500        0.53863     0.713711       9.84812e-05                                                                                                            
9           3600        0.511317    0.728789       9.86268e-05                                                                                                            
9           3700        0.502823    0.733906       9.87584e-05                                                                                                            
9           3800        0.497252    0.735625       9.88773e-05                                                                                                            
9           3900        0.491412    0.73832        9.89847e-05                                                                                                            
10          4000        0.494718    0.735938       9.90818e-05  1.18833                    0.476262                                                                       
10          4100        0.485971    0.740195       9.91696e-05                                                                                                            
10          4200        0.458216    0.752812       9.9249e-05                                                                                                            
11          4300        0.450636    0.75543        9.93207e-05                                                                                                            
11          4400        0.451316    0.75668        9.93856e-05                                                                                                            
11          4500        0.445689    0.7625         9.94443e-05                                                                                                            
11          4600        0.42933     0.769023       9.94973e-05                                                                                                            
12          4700        0.438031    0.763398       9.95453e-05                                                                                                            
12          4800        0.43478     0.767266       9.95887e-05                                                                                                            
12          4900        0.414773    0.774492       9.96279e-05                                                                                                            
12          5000        0.421017    0.776055       9.96634e-05  1.25329                    0.483273                                                                       
13          5100        0.422985    0.77332        9.96955e-05                                                                                                            
13          5200        0.399167    0.783398       9.97245e-05                                                                                                            
13          5300        0.390925    0.78668        9.97508e-05                                                                                                            
13          5400        0.380798    0.793516       9.97745e-05                                                                                                            
14          5500        0.383987    0.792734       9.9796e-05                                                                                                            
14          5600        0.377676    0.793984       9.98155e-05                                                                                                            
14          5700        0.36625     0.798086       9.9833e-05                                                                                                            
14          5800        0.366481    0.795742       9.98489e-05                                                                                                            
15          5900        0.350913    0.80875        9.98633e-05                                                                                                            
15          6000        0.339742    0.81457        9.98764e-05  1.09449                    0.540865                                                                       
15          6100        0.356839    0.803398       9.98881e-05                                                                                                            
15          6200        0.342831    0.811797       9.98988e-05                                                                                                            
16          6300        0.336951    0.81332        9.99084e-05                                                                                                            
16          6400        0.322984    0.822656       9.99172e-05                                                                                                            
16          6500        0.3313      0.818164       9.9925e-05                                                                                                            
16          6600        0.310146    0.829297       9.99322e-05                                        
...
...
...
40          15900       0.0861549   0.949766       0.0001                                                                                                                
40          16000       0.0834428   0.952695       0.0001      1.32723                    0.574519                                                                       
41          16100       0.0828832   0.952969       0.0001                                                                                                                
41          16200       0.0846093   0.952383       0.0001                                                                                                                
41          16300       0.0850615   0.95082        0.0001                                                                                                                
41          16400       0.0887739   0.948633       0.0001                                                                                                                
42          16500       0.0759526   0.956445       0.0001                                                                                                                
42          16600       0.0785966   0.954961       0.0001                                                                                                                
42          16700       0.0911997   0.947383       0.0001                                                                                                                
43          16800       0.0754143   0.957383       0.0001                                                                                                                
43          16900       0.0712908   0.960664       0.0001
43          17000       0.0672408   0.962109       0.0001      1.39767                    0.578225                                                                       
43          17100       0.0687641   0.961328       0.0001                                                                                                                
44          17200       0.0808661   0.953086       0.0001                                                                                                                
44          17300       0.071505    0.959609       0.0001                                                                                                                
44          17400       0.0728209   0.959219       0.0001                                                                                                                
44          17500       0.0615548   0.965898       0.0001                                                                                                                
45          17600       0.0684327   0.960898       0.0001                                                                                                                
45          17700       0.066399    0.962266       0.0001                                                                                                                
45          17800       0.0613631   0.96543        0.0001                                                                                                                
45          17900       0.0632529   0.964453       0.0001                                                                                                                
46          18000       0.0724716   0.959102       0.0001      1.36872                    0.581631                                                                       
46          18100       0.0663616   0.962148       0.0001                                                                                                                
46          18200       0.0552568   0.97           0.0001                                                                                                                
46          18300       0.0634862   0.963672       0.0001                                                                                                                
47          18400       0.0654915   0.964844       0.0001                                                                                                                
47          18500       0.0606005   0.965313       0.0001                                                                                                                
47          18600       0.0631603   0.965313       0.0001                                                                                                                
47          18700       0.0571044   0.967773       0.0001                                                                                                                
48          18800       0.0661906   0.962227       0.0001                                                                                                                
48          18900       0.0627057   0.963164       0.0001                                                                                                                
48          19000       0.0683961   0.961719       0.0001      1.35981                    0.571715

You can see that the training accuracy keeps rising while the validation accuracy stays stuck. I've rerun this experiment several times, and the results are similar. I don't know what I did wrong.
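To make the gap easier to see, here is a minimal plotting sketch; it assumes the JSON log file that Chainer's LogReport writes into the training directory (file name log, placeholder directory) and the report keys shown as columns in the table above:

import json

import matplotlib.pyplot as plt

# Chainer's LogReport writes a JSON list of report dicts into the training
# directory; the keys below mirror the columns of the table above.
with open("log/<training dir>/log") as f:
    records = json.load(f)

train = [(r["iteration"], r["main/accuracy"])
         for r in records if "main/accuracy" in r]
val = [(r["iteration"], r["fast_validation/main/accuracy"])
       for r in records if "fast_validation/main/accuracy" in r]

plt.plot(*zip(*train), label="main/accuracy (train)")
plt.plot(*zip(*val), marker="o", label="fast_validation/main/accuracy")
plt.xlabel("iteration")
plt.ylabel("accuracy")
plt.legend()
plt.savefig("accuracy_gap.png")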

Bartzi commented 5 years ago

Hmm, it does not look too wrong. I had similar results on this dataset; the model seems to be overfitting heavily. You can try running the evaluation script with --save-rois added and then inspect the results on multiple images. It might not work everywhere, but that is not such a huge problem, as we created this dataset only for some early experiments.
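For looking at many saved crops at once, a minimal sketch that tiles them into a single contact sheet; the directory layout (a flat folder of PNG crops written by --save-rois) is an assumption and may need adjusting:

import math
from pathlib import Path

from PIL import Image

# Assumption: --save-rois leaves a flat directory of PNG crops; adjust the
# path and glob pattern to whatever the evaluation script actually writes.
roi_dir = Path("<evaluation output dir>/rois")
crops = sorted(roi_dir.glob("*.png"))[:64]  # inspect the first 64 crops

if crops:
    with Image.open(crops[0]) as first:
        tile_w, tile_h = first.size
    cols = 8
    rows = math.ceil(len(crops) / cols)
    sheet = Image.new("RGB", (cols * tile_w, rows * tile_h), "white")
    for i, crop_path in enumerate(crops):
        with Image.open(crop_path) as crop:
            sheet.paste(crop.convert("RGB").resize((tile_w, tile_h)),
                        ((i % cols) * tile_w, (i // cols) * tile_h))
    sheet.save("roi_contact_sheet.png")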

k0286 commented 5 years ago

Ok, thank you. But I'm curious why the model is overfitting when the training set and validation set both come from the same dataset. Anyway, thanks for your reply!

Bartzi commented 5 years ago

I think that creating more training data should close the gap between train and validation accuracy, but we never tried it, because we got the results we wanted and went on to other experiments :wink:

Bartzi commented 5 years ago

You forgot to perform step 3 of the train preparations.