Hello,
Unfortunately, NegativeBinomialDistributionLoss no longer works with TFT in pytorch-forecasting 0.10.3.
The error started after Google Colab upgraded to Python 3.8, which caused my notebooks to automatically start running pytorch-forecasting 0.10.3. Downgrading to Python 3.7 and pytorch-forecasting 0.10.1 makes the error go away.
I have reproduced the error in this Colab notebook using the TFT tutorial code, slightly adjusted to switch to NegativeBinomialDistributionLoss.
The same error also occurs on other datasets where I am running TFT with NegativeBinomialDistributionLoss.
Error message:

```
ValueError: Expected parameter total_count (Tensor of shape (350, 6)) of distribution NegativeBinomial(total_count: torch.Size([350, 6]), probs: torch.Size([350, 6])) to satisfy the constraint GreaterThanEq(lower_bound=0), but found invalid values:
tensor([[ 27.3240,  25.7105,  25.1888,  25.4835,  28.0047,  18.8410],
        [ 24.8063,  23.3642,  22.9043,  23.0588,  25.0264,  16.8969],
        [ 18.9572,  17.7293,  17.4891,  17.1998,  19.3951,  12.7476],
        ...,
        [124.7026, 119.7774, 117.8373, 119.3506, 121.7064,  95.2277],
        [ 24.9350,  23.9966,  23.5536,  25.0558,  24.9452,  20.1810],
        [     nan,      nan,      nan,      nan,      nan,      nan]])
```
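For context, the failing check can be reproduced directly with `torch.distributions`, independent of pytorch-forecasting: a `total_count` tensor containing NaN (or negative values) is rejected by the same `GreaterThanEq(lower_bound=0)` constraint at construction time. A minimal sketch (the tensor values here are made up for illustration):

```python
import torch
from torch.distributions import NegativeBinomial

# A total_count row containing NaN, mirroring the last row in the error above.
total_count = torch.tensor([27.3240, float("nan")])
probs = torch.tensor([0.5, 0.5])

try:
    # validate_args=True forces the constraint check that produces the
    # "Expected parameter total_count ... GreaterThanEq(lower_bound=0)" error.
    NegativeBinomial(total_count=total_count, probs=probs, validate_args=True)
except ValueError as e:
    print("ValueError:", e)
```

This suggests the model's output head is producing NaN `total_count` values in 0.10.3, which the distribution's argument validation then rejects.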
Originally posted by @jmiller558 in https://github.com/jdb78/pytorch-forecasting/issues/339#issuecomment-1334621746