FangShancheng / conv-ensemble-str

The code for “Attention and Language Ensemble for Scene Text Recognition with Convolutional Sequence Modeling”
Apache License 2.0
43 stars 8 forks

Performance between Beam search and Greedy search #5

Closed netpcvnn closed 4 years ago

netpcvnn commented 5 years ago

Hello. Have you evaluated the performance of beam search versus greedy search?

FangShancheng commented 5 years ago

We previously ran experiments comparing beam search (beam width = 5) against greedy search (beam width = 1) and found no obvious improvement from the wider beam (perhaps a slight gain in accuracy) in our method. Moreover, greedy search is much faster than beam search with beam width = 5.
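For readers unfamiliar with the two strategies, here is a minimal, library-free sketch of greedy versus beam decoding over toy per-step probability tables. This is illustrative only, not the repo's TensorFlow decoder; the function names and toy distributions are made up.

```python
import math

def greedy_decode(step_probs):
    """Pick the highest-probability token at each step independently."""
    return [max(p, key=p.get) for p in step_probs]

def beam_decode(step_probs, beam_width):
    """Keep the beam_width highest log-probability prefixes at each step."""
    beams = [([], 0.0)]  # (token sequence, cumulative log-probability)
    for probs in step_probs:
        candidates = [
            (seq + [tok], score + math.log(p))
            for seq, score in beams
            for tok, p in probs.items()
        ]
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
    return beams[0][0]  # best-scoring sequence

# Toy distribution over tokens 'a'/'b' at three decoding steps.
steps = [{'a': 0.6, 'b': 0.4}, {'a': 0.51, 'b': 0.49}, {'a': 0.9, 'b': 0.1}]
print(greedy_decode(steps))               # ['a', 'a', 'a']
print(beam_decode(steps, beam_width=1))   # ['a', 'a', 'a'] -- same as greedy
```

With `beam_width=1` only the single best prefix survives each step, which reduces exactly to greedy search; a wider beam can recover a higher-probability sequence that greedy misses, at a proportional cost in decoding time.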

netpcvnn commented 5 years ago

Thank you for the information. Have you tried training or inference with a batch_size different from 1?

FangShancheng commented 5 years ago

The batch_size (for both training and inference) can be greater than 1. See train_eval.py and config.py for examples.

netpcvnn commented 5 years ago

Thank you. What about the batch_size in demo.py? When I change `image = tf.placeholder(tf.uint8, (1, 32, 100, 3), name='image')` to a batch_size other than 1, I get an error.

FangShancheng commented 5 years ago

demo.py also supports a larger batch size, just like train_eval.py. You can customize demo.py for a larger batch size by changing the batch dimension not only of the tf.placeholder but also of raw_image, or of other code, according to the reported error.
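As a concrete sketch of the change (illustrative, not the repo's code): loosen the placeholder's batch dimension and feed a stacked array. The placeholder line mirrors the one quoted above from demo.py; the `make_batch` helper is a hypothetical name, and the stacking step itself needs no TensorFlow:

```python
import numpy as np

# In demo.py, the placeholder's first dimension could be left dynamic, e.g.
#   image = tf.placeholder(tf.uint8, (None, 32, 100, 3), name='image')
# and then fed a stacked (N, 32, 100, 3) array instead of a single image.

def make_batch(images):
    """Stack same-sized HxWxC images into one (N, H, W, C) batch array."""
    batch = np.stack(images, axis=0)
    if batch.shape[1:] != (32, 100, 3):
        raise ValueError("all images must be 32x100x3, got %s" % (batch.shape[1:],))
    return batch

# Example: two dummy images -> one batch of shape (2, 32, 100, 3).
imgs = [np.zeros((32, 100, 3), dtype=np.uint8) for _ in range(2)]
print(make_batch(imgs).shape)  # (2, 32, 100, 3)
```

Any other code in demo.py that assumes a batch of exactly 1 (e.g. indexing the first result) would also need adjusting, as the reported errors indicate.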

netpcvnn commented 5 years ago

Thanks for your answer. When I set beam_width=0 in config.py to disable beam search and ran train_eval.py, I got an error in the dynamic_decode function in decoder_conv.py. It looks like beam search doesn't support batching, or something like that.

FangShancheng commented 5 years ago

> When I modified the config.py to beam_width=0 to force not using beam_search and run train_eval.py, I got the error in decoder_conv.py in dynamic_decode function.

beam_width=1 is equivalent to greedy search.

netpcvnn commented 5 years ago

As I understand it, even when beam_width=1, the code in datasets.py (lines 44-45) still uses batch_size = 1 for evaluation. Is that correct?

FangShancheng commented 5 years ago

Yes, beam search doesn't support batch_size > 1 in our code.

netpcvnn commented 5 years ago

Thank you. Is there any way to support batch_size > 1 in your code?