Open lukas-mazur opened 4 years ago
But I'm not sure what I should use here? Just LazyTensor(self.noise) ?
Yep, that should be mostly it. There may be some other assumptions in the code that we'll have to trace down, but this should be the main thing.
Thanks for the fast reply! Unfortunately it doesn't work, since LazyTensor has abstract methods. I'm getting this error:
File "/home/user/gpytorch/gpytorch/likelihoods/noise_models.py", line 150, in forward
    return LazyTensor(self.noise)
TypeError: Can't instantiate abstract class LazyTensor with abstract methods _matmul, _size, _transpose_nonbatch
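To see why this TypeError arises, here is a pure-Python analogy (illustrative only, not gpytorch's actual class): Python's abc machinery blocks instantiation of any class that still has unimplemented abstract methods, so one has to use a concrete subclass that supplies `_matmul`, `_size`, and `_transpose_nonbatch` (the method names are taken from the traceback above; `LazyTensorLike` and `DiagLike` are hypothetical stand-ins):

```python
from abc import ABC, abstractmethod

class LazyTensorLike(ABC):
    """Stand-in for gpytorch's LazyTensor interface (method names from the traceback)."""

    @abstractmethod
    def _matmul(self, rhs):
        ...

    @abstractmethod
    def _size(self):
        ...

    @abstractmethod
    def _transpose_nonbatch(self):
        ...

class DiagLike(LazyTensorLike):
    """Concrete subclass: a diagonal matrix stored as its diagonal vector."""

    def __init__(self, diag):
        self.diag = list(diag)

    def _matmul(self, rhs):
        # (diag @ rhs) for a vector rhs is just an elementwise product
        return [d * r for d, r in zip(self.diag, rhs)]

    def _size(self):
        return (len(self.diag), len(self.diag))

    def _transpose_nonbatch(self):
        return self  # a diagonal matrix is symmetric

try:
    LazyTensorLike()          # mirrors calling LazyTensor(self.noise)
except TypeError as e:
    print("TypeError:", e)    # cannot instantiate an abstract class

d = DiagLike([1.0, 2.0, 3.0])
print(d._matmul([1.0, 1.0, 1.0]))  # [1.0, 2.0, 3.0]
```

This is why the code works with concrete subclasses such as `DiagLazyTensor`, but not with `LazyTensor` itself.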
I have implemented it in this way:
if noise is not None:
    return DiagLazyTensor(noise)
elif shape[-1] == self.noise.shape[-1] and len(shape) == len(self.noise.shape):
    return DiagLazyTensor(self.noise)
elif (shape[-1] == self.noise.shape[-1] and shape[-1] == self.noise.shape[-2]
        and (len(shape) + 1) == len(self.noise.shape)):
    return LazyTensor(self.noise)
else:
    return ZeroLazyTensor()
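The new elif above distinguishes the stored noise by its trailing dimensions: one extra dimension whose last two sizes both equal n indicates a full n x n matrix rather than a length-n vector. A minimal pure-Python sketch of that dispatch logic (`classify_noise` is a hypothetical helper, not gpytorch code):

```python
def classify_noise(shape, noise_shape):
    """Decide whether stored noise looks like a diagonal (length-n vector)
    or a full n x n matrix, relative to the requested output shape.
    Mirrors the shape checks in the elif chain above."""
    if len(shape) == len(noise_shape) and shape[-1] == noise_shape[-1]:
        return "diagonal"          # noise stored as a length-n vector
    if (len(shape) + 1 == len(noise_shape)
            and shape[-1] == noise_shape[-1] == noise_shape[-2]):
        return "full"              # noise stored as an n x n matrix
    return "zero"                  # fall through to ZeroLazyTensor

print(classify_noise((10,), (10,)))      # diagonal
print(classify_noise((10,), (10, 10)))   # full
print(classify_noise((10,), (5,)))       # zero
```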
Is there a non-abstract class available for that?
There is NonLazyTensor if you want a full noise covariance.
Thanks! With that it runs! However, with the first iteration I get this warning:
/home/user/gpytorch/gpytorch/utils/cholesky.py:44: NumericalWarning: A not p.d., added jitter of 1e-08 to the diagonal warnings.warn(f"A not p.d., added jitter of {jitter_new} to the diagonal", NumericalWarning)
Apparently, the full covariance matrix is not positive definite anymore. My noise matrix is p.d.; I check that right before I pass it to the likelihood:
try:
    test = torch.cholesky(noise_covar, upper=False)
except RuntimeError as e:
    print("Error! noise_covar is not p.d.")
    exit()

likelihood = gpytorch.likelihoods.FixedNoiseGaussianLikelihood(
    noise=noise_covar,
    batch_shape=self.batch_size,
)
What else could be the reason?
Not sure. The covar being p.s.d. and the noise being p.d. means that the sum will be p.d. Probably your noise level is quite low relative to the scale of the covariance, so the sum is really ill-conditioned and Cholesky fails. A jitter of 1e-8 should be fine though.
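This failure mode can be reproduced without gpytorch or torch. In this pure-Python sketch (an illustration, not gpytorch's actual Cholesky routine), factorization fails on a matrix that is p.s.d. but singular, and succeeds once a tiny diagonal jitter makes it strictly p.d.:

```python
import math

def cholesky(A):
    """Plain Cholesky factorization; raises ValueError if A is not p.d."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = A[i][i] - s
                if d <= 0.0:
                    raise ValueError("matrix is not positive definite")
                L[i][j] = math.sqrt(d)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

# K is p.s.d. but singular (rank 1): Cholesky fails on it alone.
K = [[1.0, 1.0], [1.0, 1.0]]
try:
    cholesky(K)
except ValueError as e:
    print("without jitter:", e)

# A tiny diagonal term (jitter, or a small p.d. noise matrix) makes
# the sum strictly p.d., and the factorization goes through.
jitter = 1e-8
K_jittered = [[K[i][j] + (jitter if i == j else 0.0) for j in range(2)]
              for i in range(2)]
L = cholesky(K_jittered)
print("with jitter, L[1][1] =", L[1][1])
```

In that sense jitter and additional learned noise act similarly: both add a positive term to the diagonal. The difference is that learned noise is a model parameter fit during training, while jitter is a fixed numerical safeguard.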
That could be possible. Does the jitter have the same effect as adding additional noise? If I allow learning additional noise, I do not get this warning at all.
likelihood = gpytorch.likelihoods.FixedNoiseGaussianLikelihood(
    noise=noise_covar,
    batch_shape=self.batch_size,
    learn_additional_noise=True,
)
Note that if you see this warning consistently then things will unfortunately be quite slow b/c of pytorch error handling: pytorch/pytorch#34272
I get this warning only once, in the first iteration, using an RBF kernel. If I use an RQ kernel it happens at some random iteration, but still only once.
🚀 Feature Request
The FixedNoiseGaussianLikelihood class accepts a noise vector of size n, which I guess is then expanded to an n x n diagonal matrix. How can I modify that class so that it directly accepts a non-diagonal n x n matrix instead of the vector of size n?
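For illustration, a pure-Python sketch (a hypothetical helper, not gpytorch's actual code) of the difference between the current vector input and the requested full-matrix input:

```python
def diag_from_vector(noise):
    """Expand a length-n noise vector into an n x n diagonal matrix,
    mimicking what a vector-valued noise input amounts to."""
    n = len(noise)
    return [[noise[i] if i == j else 0.0 for j in range(n)] for i in range(n)]

noise = [0.1, 0.2, 0.3]
print(diag_from_vector(noise))
# [[0.1, 0.0, 0.0], [0.0, 0.2, 0.0], [0.0, 0.0, 0.3]]
# A full n x n noise matrix, by contrast, can also carry off-diagonal
# entries encoding correlations between observations i and j.
```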
Motivation
For my data I have not only the observed noise but also information on how strongly correlated some of the data points x_i, x_j are. I want to include that information in the regression.
Pitch
I assume I have to modify the class FixedGaussianNoise in noise_models.py and add another elif statement here:
But I'm not sure what I should use here? Just LazyTensor(self.noise) ?
Are you willing to open a pull request? (We LOVE contributions!!!) -> Yes, sure!