da03 / Attention-OCR

Visual Attention based OCR
MIT License

Magic line for visualising attention #69

Open cipri-tom opened 6 years ago

cipri-tom commented 6 years ago

Hello,

Thank you for releasing your code! It is a great contribution, and I can see it helps a lot of people (including me!).

I have a question about the visualize_attention() method, namely this line: https://github.com/da03/Attention-OCR/blob/88cff37cad09fd85b5178662130595821d7fe0a2/src/model/model.py#L447

I believe its purpose is to highlight the most important region of the attention map and drive the rest to zero, similar to a softmax (a sketch of what I mean follows below). I'm curious how you arrived at the numbers used in the filter. Any insight would be greatly appreciated!
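
To be clear about the effect I have in mind, here is a minimal sketch of that kind of sharpening, assuming a temperature-scaled softmax over the attention weights. This is only an illustration of my reading, not the actual line from model.py, and the `sharpen_attention` helper and its `temperature` value are my own invention:

```python
import numpy as np

def sharpen_attention(attention, temperature=10.0):
    """Sharpen a 1-D attention map so the peak dominates.

    attention:   array of non-negative attention weights
    temperature: assumed scaling factor; larger values give a harder peak
    """
    scaled = np.asarray(attention, dtype=np.float64) * temperature
    scaled -= scaled.max()           # subtract max for numerical stability
    weights = np.exp(scaled)
    return weights / weights.sum()   # renormalise so the map sums to 1

# Example: a mild peak at index 2 becomes a pronounced one,
# while the surrounding weights are pushed toward zero.
att = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
print(sharpen_attention(att))
```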

Thanks!