quancore / social-lstm

Social LSTM implementation in PyTorch

Loss Function Compute #38

Open zhangyanide opened 2 years ago

zhangyanide commented 2 years ago

In the definition of Gaussian2DLikelihood, you evaluate the density function. When the density value is > 1, result = -torch.log(torch.clamp(result, min=epsilon)) is < 0, so the loss is < 0. I think a probability value should lie between 0 and 1, and the cross-entropy should be > 0. Is that right? I look forward to your reply.
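
The behaviour is easy to reproduce with a small standalone sketch. The function below is a simplified bivariate-Gaussian negative log-likelihood written only for illustration; it is not the repository's exact Gaussian2DLikelihood, and the names bivariate_nll, mux, muy, sx, sy, rho, epsilon are my own. It shows that when the predicted standard deviations are small, the density at the ground-truth point exceeds 1, so the clamped -log turns negative.

```python
# Minimal sketch (not the repo's exact Gaussian2DLikelihood) of a bivariate
# Gaussian negative log-likelihood, to illustrate why the loss can go below 0.
import torch

def bivariate_nll(x, y, mux, muy, sx, sy, rho, epsilon=1e-20):
    # Normalised offsets from the predicted mean.
    normx = (x - mux) / sx
    normy = (y - muy) / sy
    z = normx ** 2 + normy ** 2 - 2 * rho * normx * normy
    neg_rho = 1 - rho ** 2
    # Bivariate Gaussian density evaluated at the ground-truth point.
    density = torch.exp(-z / (2 * neg_rho)) / (
        2 * torch.pi * sx * sy * torch.sqrt(neg_rho)
    )
    # Clamp before the log so a zero density does not produce inf/nan.
    return -torch.log(torch.clamp(density, min=epsilon))

# With small predicted standard deviations the density at the mean exceeds 1
# (here 1 / (2*pi*0.01) ~= 15.9), so the negative log-likelihood is negative.
x = y = mux = muy = torch.tensor(0.0)
sx = sy = torch.tensor(0.1)
rho = torch.tensor(0.0)
print(bivariate_nll(x, y, mux, muy, sx, sy, rho))  # ~ -2.77
```

So a negative value here is not a bug in itself: the quantity being logged is a density, not a probability, and densities are not bounded by 1.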

llllys commented 2 years ago

Computing a log-loss on continuous variables is indeed confusing, but that is also what the original paper does, so I don't fully understand it either... Besides, the value of a probability density function at a single point doesn't really mean anything by itself.
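
One way to see why the sign of the loss doesn't matter for training: a constant offset to the loss leaves its gradients unchanged, so a negative NLL optimises exactly the same parameters as a shifted, positive one. The snippet below is a generic check (a 1-D Gaussian NLL, independent of the repository's code; all names are mine), not the project's loss.

```python
# Quick check: shifting a loss by a constant does not change its gradients,
# so a negative NLL is still a valid training objective.
import torch

sigma = torch.tensor(0.1, requires_grad=True)
x, mu = torch.tensor(0.05), torch.tensor(0.0)

# 1-D Gaussian negative log-likelihood (1-D just to keep the check short).
nll = (
    0.5 * ((x - mu) / sigma) ** 2
    + torch.log(sigma)
    + 0.5 * torch.log(torch.tensor(2 * torch.pi))
)
shifted = nll + 10.0  # arbitrary constant shift

g1, = torch.autograd.grad(nll, sigma, retain_graph=True)
g2, = torch.autograd.grad(shifted, sigma)
print(torch.allclose(g1, g2))  # True: only relative loss values matter
```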