summerlvsong / Aggregation-Cross-Entropy

Aggregation Cross-Entropy for Sequence Recognition. CVPR 2019.

Need Help! Loss nan #2

Closed allen4747 closed 5 years ago

allen4747 commented 5 years ago

for this line: torch.log(input)

The 'input' is the softmax score (between 0 and 1). If the k-th class does not appear in an input, the accumulated softmax score for that class over all time steps is very likely to be 0, and torch.log(input) then produces NaN.

How do you make sure that 'input' is never 0 when computing torch.log(input)?
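For illustration, here is a minimal PyTorch sketch of the failure mode (tensor names and values are hypothetical, not from the repo): when a class's aggregated score is exactly 0, `torch.log` returns `-inf`, and multiplying by that class's zero target count gives `0 * (-inf) = NaN` under IEEE arithmetic, poisoning the whole loss. Clamping the input away from 0 with a small epsilon keeps the log finite.

```python
import torch

# Aggregated per-class softmax scores over all time steps; class 0 never
# appears in this sample, so its accumulated score is exactly 0.
# (Values are illustrative, not the repo's actual tensors.)
agg_probs = torch.tensor([0.0, 0.3, 0.7])
targets = torch.tensor([0.0, 0.3, 0.7])  # normalized label counts

# Naive cross-entropy: 0 * log(0) = 0 * (-inf) = NaN
naive = -(targets * torch.log(agg_probs)).sum()
print(naive)  # tensor(nan)

# Clamping with a small epsilon keeps every log term finite.
eps = 1e-10
safe = -(targets * torch.log(agg_probs.clamp(min=eps))).sum()
print(safe)  # a finite loss value
```

The clamped term for an absent class contributes exactly 0 to the sum (its target count is 0 and `log(eps)` is finite), so the fix does not change the loss for classes that actually occur.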

summerlvsong commented 5 years ago

NAN problem fixed. Please refer to line 34 of 'source/models/seq_module.py'.

allen4747 commented 5 years ago

> NAN problem fixed. Please refer to line 34 of 'source/models/seq_module.py'.

Thanks!

TangDL commented 2 years ago

Make sure input > 0, e.g. by clamping it with a small epsilon before taking the log.