Closed 5 years ago
Loss2 is the negative log probability of the prior p(\sigma^2). If you choose to train \sigma^2, this term will matter. The reason we count non-zeros is that rnn_truth is a collection of different-length sequences, padded with zeros so they fit into a single tensor. Counting the non-zero entries just sums the 'real lengths' of all sequences.
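A minimal sketch of the two ideas above, assuming a NumPy array stands in for the padded tensor. The batch contents, and the `alpha`/`beta` hyperparameters of the inverse-gamma prior (a common choice of prior for a variance like \sigma^2), are illustrative assumptions, not the repository's actual values:

```python
import numpy as np

# Hypothetical padded batch: two sequences of real lengths 3 and 2,
# zero-padded to a common length of 4 so they fit in one tensor.
rnn_truth = np.array([
    [1.0, 2.0, 3.0, 0.0],
    [4.0, 5.0, 0.0, 0.0],
])

# Sum of 'real lengths' of all sequences = number of non-zero entries.
num_non_zero = int(np.count_nonzero(rnn_truth))  # 3 + 2 = 5

def neg_log_sigma2_prior(sigma2, alpha=1.0, beta=1.0):
    """Negative log-density of an assumed inverse-gamma prior on sigma2,
    p(sigma2) ∝ sigma2^(-alpha-1) * exp(-beta / sigma2),
    up to an additive constant that does not affect the gradient."""
    return (alpha + 1.0) * np.log(sigma2) + beta / sigma2

loss2 = neg_log_sigma2_prior(1.0)
```

Because the prior only enters the loss through \sigma^2, its gradient is zero unless \sigma^2 is a trainable parameter, which is why the term only matters when you choose to train it.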
Thanks, Aonan, for explaining. Closing this issue now. Please re-open it if you have more questions.
During the fit process, the loss has 3 parts. Could anyone tell me the meaning of loss2? And why does the code count the number of non-zeros (just above the loss2 calculation)?