Closed Rosarubu closed 3 years ago
A good suggestion; this can indeed cause problems. However, in my opinion, filling the scores with a tiny value is not suitable, because scores at other positions can be negative, in which case a tiny fill value will not produce a zero weight after softmax.
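A minimal sketch of that failure mode (the scores here are hypothetical, chosen only for illustration): if the genuine scores are negative, a position "masked" with a tiny value ends up with the *largest* score and dominates the softmax.

```python
import torch

# Hypothetical scores: position 0 is "masked" with a tiny value (1e-13)
# instead of -inf; position 1 holds a genuine, negative score.
scores = torch.tensor([1e-13, -5.0])
weights = torch.softmax(scores, dim=-1)
print(weights)  # the "masked" position receives almost all of the attention
```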
We can simply mask the nan values with zero after softmax to solve this problem:
scores = scores.masked_fill(torch.isnan(scores), 0)
I will add this in the next commit :)
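To make the problem and the proposed fix concrete, here is a small sketch (standalone tensors, not the repo's actual layer): when every score in a row is filled with `-inf`, softmax computes 0/0 and yields `nan`, which the post-softmax `masked_fill` then replaces with zeros.

```python
import torch

# A fully masked row: every score set to -inf before softmax.
scores = torch.full((1, 4), float('-inf'))
attn = torch.softmax(scores, dim=-1)
print(attn)  # all nan: exp(-inf) / sum(exp(-inf)) is 0/0

# The fix discussed above: zero out the nan entries after softmax.
attn = attn.masked_fill(torch.isnan(attn), 0)
print(attn)  # tensor([[0., 0., 0., 0.]])
```

A fully masked row then contributes no attention at all, which is the intended behavior.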
Hi, thanks for the great work. I found that in https://github.com/THUwangcy/ReChorus/blob/dba1d0bd7b6d7296ed6c730793e0f61278007dc2/src/utils/layers.py#L53, filling the scores with `-np.inf` causes `nan` in the result when the entire input vector is masked. Using a tiny value like 1e-13 would solve this problem.