metagenAu opened 1 year ago
Hi @metagenAu, the negative binomial likelihood is very sensitive to unfavourable hyperparameter combinations in your search space, which can cause the observed behaviour where the optimisation fails and training is aborted. My advice is to revise the hyperparameter ranges: decrease the upper and increase the lower limits of the regularization hyperparameters.
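The constraint named in the traceback comes from PyTorch's `NegativeBinomial` distribution: the success probability `probs` must lie in the half-open interval [0, 1). A minimal Python sketch of the idea (`stable_probs` is a hypothetical helper for illustration, not part of sjSDM) — the `grad_fn=<ClampBackward1>` and the repeated `9.9990e-01` / `1.0000e-04` entries in the traceback suggest the model clamps probabilities at roughly `eps = 1e-4`, but if an unfavourable regularization setting makes the forward pass produce NaN or infinite values, no clamp can rescue them:

```python
import math

def stable_probs(logit, eps=1e-4):
    """Map a real-valued logit to a NegativeBinomial success probability.

    torch.distributions.NegativeBinomial validates probs against
    HalfOpenInterval(0.0, 1.0); clamping the sigmoid output into
    [eps, 1 - eps] keeps even extreme logits strictly inside that range.
    """
    p = 1.0 / (1.0 + math.exp(-logit))   # sigmoid
    return min(max(p, eps), 1.0 - eps)   # clamp into [eps, 1 - eps]
```

With `eps = 1e-4`, an extreme positive logit maps to 0.9999, matching the `9.9990e-01` entries above; NaN values, by contrast, fail the interval check outright, which is why narrowing the tuning ranges is the practical fix rather than more aggressive clamping.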
Hey guys,
I'm getting the following error when trying to tune a negative binomial sjSDM. The error didn't crop up straight away.
See below:
```
Error in checkForRemoteErrors(val) : one node produced an error: ValueError: Expected parameter probs (Tensor of shape (100, 20, 395)) of distribution NegativeBinomial(total_count: torch.Size([100, 20, 395]), probs: torch.Size([100, 20, 395])) to satisfy the constraint HalfOpenInterval(lower_bound=0.0, upper_bound=1.0), but found invalid values: <... omitted ...>
[[9.9990e-01, 9.9345e-01, 9.9990e-01, ..., 9.9990e-01, 9.9927e-01, 9.9990e-01],
 [6.4720e-02, 7.1444e-01, 1.2542e-01, ..., 1.0226e-04, 4.5144e-03, 8.4232e-04],
 [9.8830e-01, 1.1321e-02, 8.7012e-01, ..., 1.7798e-03, 1.0024e-04, 9.9990e-01],
 ...,
 [9.9990e-01, 9.9990e-01, 1.0000e-04, ..., 9.9990e-01, 9.9990e-01, 9.9990e-01],
 [9.9990e-01, 9.9990e-01, 1.0000e-04, ..., 9.9856e-01, 9.9990e-01, 9.9939e-01],
 [9.9990e-01, 9.9990e-01, 1.1007e-04, ..., 9.8450e-01, 3.0850e-04, 9.9990e-01]]], grad_fn=<ClampBackward1>)
```

See `reticulate::py_last_err`.

Cheers,
Chris