[Closed] kilanny closed this issue 5 years ago
@ibraheemalkilanny You may reduce the sequence length and test again:)
@MaybeShewill-CV Reducing SEQ_LENGTH to 14, 15, or 16 gives me an inf cost.
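If the cost goes to inf, one likely cause is that SEQ_LENGTH is shorter than what CTC needs for some labels: CTC requires at least one time step per label character, plus one mandatory blank between every pair of adjacent identical characters. A minimal pure-Python sketch of that check (`min_ctc_timesteps` is a hypothetical helper, not part of this repo):

```python
def min_ctc_timesteps(label: str) -> int:
    """Minimum number of time steps CTC needs to emit `label`:
    one step per character plus one mandatory blank between
    each pair of adjacent identical characters."""
    repeats = sum(1 for a, b in zip(label, label[1:]) if a == b)
    return len(label) + repeats

# Labels taken from the no.csv sample in this thread:
print(min_ctc_timesteps("24438184568785"))  # 15 (one "44" repeat)
print(min_ctc_timesteps("54105872404867"))  # 14 (no adjacent repeats)
```

So with 14-digit labels, a SEQ_LENGTH of exactly 14 is infeasible for any label containing an adjacent repeated digit, and the CTC loss for those samples becomes infinite.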
A sample from no.csv:
idx,label
1,24438184568785
2,54105872404867
3,84064371245548
4,74583093168537
5,98424321583283
6,28376215059437
7,53971777788694
8,33491558037994
9,12195351881743
10,78040480271009
11,12466541365433
12,85041576740458
13,43116916958650
14,46899200313839
15,23540081557455
16,74543767222738
17,92751256256102
18,63114905756239
19,10102667018211
20,09362298900141
21,47182254377035
22,93132088505022
23,95059362370598
24,06675732053885
25,38298112821601
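Before blaming SEQ_LENGTH, it is also worth sanity-checking the annotations themselves. A small sketch (with a few rows copied from the no.csv sample above) that verifies every label is exactly 14 characters drawn from 0-9:

```python
import csv
import io

# A few rows copied from the no.csv sample in this thread.
sample = """idx,label
1,24438184568785
2,54105872404867
3,84064371245548
"""

for row in csv.DictReader(io.StringIO(sample)):
    label = row["label"]
    # Every label should be exactly 14 digits, charset 0-9.
    assert len(label) == 14 and label.isdigit(), row
print("all labels OK")
```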
@ibraheemalkilanny 1. Make sure you have correctly organized your training data which is supposed to share the same format as the Synth90k dataset. 2. Reduce the sequence when testing:)
I am training on a dataset whose labels contain only the digits 0-9; every label is exactly 14 digits. After training for 20,000 epochs I get a 36-digit prediction, e.g. for ground truth 54105872404867 I got 8050364139709205551580, even though this image was in the training data.
Trained using tf version 1.14.0:
I built a dataset of variable width/height images listed in a CSV as image_name,label, then generated lexicon.txt and the train, test, and validation files using
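For reference, a minimal sketch of the kind of train/test/validation split step described here (the function name, file layout, and 80/10/10 ratios are my assumptions, not the repo's actual tool):

```python
import random

def split_annotations(lines, seed=42, train=0.8, val=0.1):
    """Shuffle annotation lines ("image_name label") and split
    them into train/validation/test portions (hypothetical helper)."""
    lines = list(lines)
    random.Random(seed).shuffle(lines)
    n = len(lines)
    n_train = int(n * train)
    n_val = int(n * val)
    return (lines[:n_train],
            lines[n_train:n_train + n_val],
            lines[n_train + n_val:])

# Synthetic annotation lines with 14-digit zero-padded labels.
annotations = ["img_{:03d}.jpg {:014d}".format(i, i) for i in range(100)]
train_set, val_set, test_set = split_annotations(annotations)
print(len(train_set), len(val_set), len(test_set))  # 80 10 10
```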
Then generated tfrecords:
Finally decreased epoch count and started training: