irinaespejo opened this issue 1 year ago (Open)
What happens when you z-score your x values (so transform them to be zero mean, unit variance)? This usually fixes these kinds of numerical instabilities.
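For concreteness, a minimal sketch of what z-scoring the inputs could look like (the tensor names and shapes here are only illustrative placeholders):

```python
import torch

# Illustrative tensors standing in for the real training / test inputs.
train_x = torch.rand(1000, 4)
test_x = torch.rand(200, 4)

# z-score with training-set statistics only, then reuse them on the test set.
mean = train_x.mean(dim=0, keepdim=True)
std = train_x.std(dim=0, keepdim=True)
train_x_z = (train_x - mean) / std
test_x_z = (test_x - mean) / std
```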
Sorry, forgot to mention that training and evaluation are done with the data re-scaled to the hypercube, not z-scored, though. I will try z-scoring. Thanks!
Hmm I didn't realize the data are rescaled to the hypercube. How many data points are you using? And is the data 1 dimensional?
Sorry for the late reply. The data X is 4-D and the target y is 1-D. There are approx. 6000 points for task #1 and 1000 points for task #2. The plots that I show are 1-D slices, fixing 3 of the 4 features of X.
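For reference, a minimal sketch of how such a 1-D slice could be constructed for plotting (the grid range and the fixed feature values below are purely illustrative):

```python
import torch

# Build a 1-D slice through the 4-D input space: sweep feature 0 over a grid
# while holding features 1-3 fixed at chosen values.
fixed = torch.tensor([0.5, 0.5, 0.5])   # illustrative fixed values for features 1-3
grid = torch.linspace(0.0, 1.0, 200)    # sweep feature 0 across the unit interval

slice_x = torch.cat([grid.unsqueeze(-1), fixed.expand(200, 3)], dim=-1)  # shape (200, 4)
# slice_x can then be fed to the trained model to plot the posterior
# mean / variance along that slice.
```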
I will try to take a look at the issue this weekend.
@irinaespejo can you please post a fully runnable code example, i.e. something that I can copy-paste into a script and reproduce the results that you see?
Hello 👋
Changing gpytorch.settings.eval_cg_tolerance() during evaluation produces a dramatic change in the predicted variance of a Multitask GP with a Gaussian likelihood. I changed only the default tolerance, from 1E-2 to 1E-6, and obtained the following posterior variance plots. Is this expected behavior?
Reproduce the issue
The model
The code
Pre-trained model with noise covariance = 0.1 from GaussianLikelihood. Data and model dict attached.
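Since the original script is only in the attachment, here is a minimal, self-contained sketch of the kind of comparison involved. The model below is a hypothetical Hadamard-style multitask GP stand-in (RBF data kernel times an IndexKernel over two tasks, with a plain GaussianLikelihood), not the attached model, and the data are synthetic:

```python
import torch
import gpytorch

# Hypothetical stand-in for the attached model; the real architecture and
# hyperparameters live in the attached model dict.
class MultitaskGPModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_i, train_y, likelihood):
        super().__init__((train_x, train_i), train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.RBFKernel(ard_num_dims=4)
        self.task_covar_module = gpytorch.kernels.IndexKernel(num_tasks=2, rank=1)

    def forward(self, x, i):
        mean_x = self.mean_module(x)
        covar = self.covar_module(x).mul(self.task_covar_module(i))
        return gpytorch.distributions.MultivariateNormal(mean_x, covar)

# Synthetic data on the unit hypercube, standing in for the attached data.zip.
n = 500
train_x = torch.rand(n, 4)
train_i = torch.randint(0, 2, (n, 1))
train_y = torch.sin(2 * torch.pi * train_x.sum(-1)) + 0.1 * torch.randn(n)

likelihood = gpytorch.likelihoods.GaussianLikelihood()
likelihood.noise = 0.1  # the white-noise hyperparameter mentioned in the issue
model = MultitaskGPModel(train_x, train_i, train_y, likelihood)
# With the real reproduction one would instead load the attached state dict, e.g.
# model.load_state_dict(torch.load("model_state.pth"))  # path is illustrative

model.eval()
likelihood.eval()
test_x = torch.rand(200, 4)
test_i = torch.zeros(200, 1, dtype=torch.long)

def posterior_variance(tol):
    # max_cholesky_size(0) forces the CG path even on this small toy problem,
    # so that eval_cg_tolerance actually has an effect.
    with torch.no_grad(), \
         gpytorch.settings.max_cholesky_size(0), \
         gpytorch.settings.eval_cg_tolerance(tol):
        return likelihood(model(test_x, test_i)).variance

var_default = posterior_variance(1e-2)  # default evaluation CG tolerance
var_tight = posterior_variance(1e-6)    # tightened tolerance from the issue
print("max abs difference in posterior variance:",
      (var_default - var_tight).abs().max().item())
```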
Expected behavior
I expected that evaluating with the default CG tolerance of 1E-2 would give results consistent with the white-noise hyperparameter = 0.1.
Thank you!
data.zip