omni-us / research-seq2seq-HTR


Attention weights remain constant during training #6

Open solenetarride opened 3 years ago

solenetarride commented 3 years ago

Thank you very much for sharing your code. I am trying to reproduce your results on the IAM database. Training works fine and the loss decreases, but the attention weights stay at 0 during training.
Do you have an idea of what may be causing this issue? Here is an example of a test image after ~20 epochs:

[attention visualization: test_c04-110-03-12_0]
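For reference, this is roughly how I am checking the weights (a minimal sketch; the tensor shapes and the `summarize_attention` helper are illustrative assumptions, not the repo's actual API):

```python
import torch

# Minimal sketch (shapes and names are assumptions, not the repo's API):
# given per-step attention weights of shape
# (batch, decode_steps, encoder_steps), print a quick summary to check
# whether they are all (near) zero or otherwise degenerate.
def summarize_attention(attn: torch.Tensor) -> None:
    print(f"min={attn.min().item():.4f} "
          f"max={attn.max().item():.4f} "
          f"mean={attn.mean().item():.4f}")
    # Softmax-normalized attention should sum to ~1 over the encoder axis;
    # all-zero rows suggest the weights are never updated or normalized.
    print("row sums (first 5 steps):", attn.sum(dim=-1)[0, :5])

# Stand-in for real model output, just to show the expected shape:
attn = torch.softmax(torch.randn(2, 10, 50), dim=-1)
summarize_attention(attn)
```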