leitro / attentionMNISTseq2seq

Use encoder-decoder with Bahdanau attention to predict MNIST digits sequence

how to calculate CER #3

Open · mayang1 opened this issue 5 years ago

mayang1 commented 5 years ago

How do I generate cer_test.log?

leitro commented 5 years ago

Hi! Use pytasas_words.py to generate the CER file, but make sure the paths of the ground truth and prediction files are set correctly in that file.
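For reference, CER here is the total character-level edit distance between the predicted words and the ground-truth words, divided by the total number of ground-truth characters. A minimal sketch of that calculation (this is not the repo's pytasas_words.py, just the formula spelled out):

```python
def levenshtein(a, b):
    # Character-level edit distance via the classic dynamic-programming recurrence.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def cer(gt_words, pred_words):
    # CER = total edit distance / total number of ground-truth characters.
    errors = sum(levenshtein(p, g) for g, p in zip(gt_words, pred_words))
    return errors / sum(len(g) for g in gt_words)

print(cer(['DARKNESS', 'had'], ['DARKNES', 'had']))  # 1 error over 11 characters ~ 0.0909
```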

mayang1 commented 5 years ago

For my future research, I reproduced your experiment, but only with a small amount of data. The following problems occur when calculating the accuracy.

The first one is train_perdict_seq.40.log, the second one is ground_truth, and the third one is cer_train.log.

I think it is because the order of the predicted text and the order of the labels are inconsistent, so I modified the code, but the results have not changed. I don't know how to deal with this problem. Can you help me? Thank you very much.

leitro commented 5 years ago

I believe you have already made it work haha. So for people who have the same problem, please refer to the more detailed answer at https://github.com/omni-us/research-seq2seq-HTR/issues/1.

mayang1 commented 5 years ago

So far, I have two questions about the evaluation of the results:

  1. The generated prediction text and the label text are in a different order, and I do not know whether this affects the CER calculation.

     valid_predict_serq.8.log:

         e04-068-00-03,171 incuuded
         f01-053-06-06,162 wocabuetry
         b03-109-05-07,191 -
         b06-053-01-06,184 practice
         c06-043-00-03,177 an
         e04-062-07-09,166 one
         r06-126-03-05,185 helped
         c06-020-03-06,182 untertien

     ground_truth.thresh:

         n02-000-00-00,177 DARKNESS
         n02-000-00-01,177 had
         n02-000-00-02,177 descended
         n02-000-00-03,177 like
         n02-000-00-04,177 a

  2. I modified the code, but the resulting CER values are small and do not match the best result (5.01) reported in your article. The CER calculation in the code you shared uses ready-made code, and I don't know how it is computed.

     cer_valid.log:

         0.56144434 0.3129064 0.21466828 0.16654824999999998 0.13386255 0.11146773 0.09625288 0.0844089 0.07435356 0.06895859 0.060859079999999996 0.05581309 0.051558719999999995 0.0470168 0.04370005

I need your help. Thank you again.

leitro commented 5 years ago

Hi! For the first question, the order doesn't matter when calculating the CER and WER. About the second question, if I understand correctly, the result reported in the paper was 5.01%, i.e. 0.0501, and the best result from your own experiment is the last one, "0.04370005", which is better than mine, right? So you are doing better haha :-)
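For anyone else worried about the ordering: the two files can be matched on the image ID in the first column before scoring, which is why the line order never matters. A minimal sketch under that assumption (the `<id>,<width> <word>` line format and the file names are taken from the logs pasted above; `editdistance` is just one convenient third-party edit-distance helper, not necessarily what the repo uses):

```python
import editdistance  # pip install editdistance, or any other Levenshtein helper

def load_words(path):
    """Return {image_id: word}, dropping the ',<width>' suffix after the ID."""
    words = {}
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) >= 2:
                words[parts[0].split(',')[0]] = parts[1]
    return words

gt = load_words('ground_truth.thresh')         # file names as pasted above
pred = load_words('valid_predict_serq.8.log')

# Matching by ID means the two files may list the samples in any order.
errors = sum(editdistance.eval(pred.get(k, ''), w) for k, w in gt.items())
score = errors / sum(len(w) for w in gt.values())
print(f'CER: {score:.4f} ({score * 100:.2f}%)')  # e.g. 0.0437 -> 4.37%, better than the paper's 5.01%
```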

mayang1 commented 5 years ago

Hi! You are a very good scholar. First of all, thank you very much for your answers. Secondly, my training data is limited, so the results are not accurate. I will continue to study. Thank you again for your help.

mayang1 commented 5 years ago

Hi! Sorry to bother you again. I successfully ran the code with your help, but there are still a few problems: (1) The results on the validation set fluctuate; I tried to modify the batch size, but the results did not change, so I wanted to ask about it. (2) By the way, which encoder are you using, VGG16_bn or VGG19_bn, and which one works better?

Your friend,
mayangyang

mayang1 commented 5 years ago

Dear Sir, I am a college student. In order to graduate next year, I have to complete a paper on "handwritten text recognition" by October. At present, I have a few ideas. The code structure you provided is clear and concise, and I really appreciate it. To realize my ideas, I hope to reproduce your code and achieve better recognition (5%).

I hope that you are happy every day (as thanks, the following is meant to make you happy).

Your friend,
mayangyang

leitro commented 5 years ago

VGG19_bn is the best! Give it a try, cheers!
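
For anyone trying the swap themselves, here is a minimal sketch of wrapping torchvision's vgg19_bn convolutional stack as the encoder; the reshaping into a feature sequence is illustrative, not necessarily the exact configuration used in this repo or the HTR one:

```python
import torch
import torch.nn as nn
from torchvision import models

class VGG19BNEncoder(nn.Module):
    """Use vgg19_bn's conv/batch-norm stack as a feature-sequence encoder."""
    def __init__(self):
        super().__init__()
        # Pass pretrained weights here if ImageNet initialisation is wanted.
        self.features = models.vgg19_bn().features

    def forward(self, x):
        # x: (batch, 3, H, W) -> conv features (batch, 512, H/32, W/32)
        f = self.features(x)
        b, c, h, w = f.shape
        # Flatten the spatial grid into a sequence for the attention decoder.
        return f.view(b, c, h * w).permute(0, 2, 1)  # (batch, H/32 * W/32, 512)

# Quick shape check with a dummy batch of 64x256 images.
enc = VGG19BNEncoder()
print(enc(torch.randn(2, 3, 64, 256)).shape)  # torch.Size([2, 16, 512])
```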