Closed: Lazloo closed this issue 5 years ago.
Sorry for the delay on this -- I'm still traveling so computer access is a bit sparse!
@gpleiss If AdditiveStructureKernel is intended to work in batch mode, it definitely isn't -- it's summing over both the batch and dimension portions of the batch.
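A minimal NumPy sketch of the shape problem described above; the array layout here is an illustrative assumption, not GPyTorch's actual internals:

```python
import numpy as np

# Toy stand-in for a batch of per-dimension kernel matrices, shaped
# (batch, d, n, n), where d is the number of input dimensions that
# an additive-structure kernel handles separately.
batch, d, n = 3, 2, 4
per_dim_covars = np.random.rand(batch, d, n, n)

# Intended additive structure: sum over the dimension axis only,
# keeping one covariance matrix per batch element.
correct = per_dim_covars.sum(axis=1)     # shape (batch, n, n)

# The behavior described above: summing over both the batch and
# dimension axes collapses the batch dimension as well.
buggy = per_dim_covars.sum(axis=(0, 1))  # shape (n, n)

print(correct.shape, buggy.shape)  # (3, 4, 4) (4, 4)
```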
@Lazloo In the meantime, with 2D data you really shouldn't need additive structure. This model works fine on that data:
```python
import gpytorch


class MultitaskGPModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super(MultitaskGPModel, self).__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        grid_size = gpytorch.utils.grid.choose_grid_size(train_x, kronecker_structure=True)
        print(grid_size)
        self.covar_module = gpytorch.kernels.GridInterpolationKernel(
            gpytorch.kernels.ScaleKernel(
                gpytorch.kernels.MaternKernel(nu=1.5),
            ), grid_size=int(grid_size), num_dims=train_x.shape[-1]
        )

    def forward(self, x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)
```
If your actual data is higher dimensional, I would recommend using either DKL or InducingPointKernel.
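A rough illustration of why grid interpolation stops scaling beyond a few dimensions (illustrative numbers, not GPyTorch output): the number of points on a Kronecker grid grows exponentially with dimensionality, which is why methods like DKL or inducing points become preferable in high dimensions.

```python
# A Kronecker grid places grid_size points along each input dimension,
# so the total number of grid points is grid_size ** d. This is fine
# for 1D or 2D inputs but blows up quickly as d grows.
grid_size = 100  # points per dimension (illustrative value)
for d in (1, 2, 3, 6):
    print(f"d={d}: {grid_size ** d} grid points")
```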
I have a feeling this bug will go away with multi-batch LazyTensors (and subsequent updates to models and kernels). There is a plan (though not a short-term one) to fix all of this, but it requires some major refactoring within GPyTorch.
I agree with Jake that, for the time being, additive structure shouldn't be necessary for 2D data, so you can get away with not using it.
Hey,
I would like to perform "Scalable GP Regression" for a data set with multiple output variables. I use the following data set:
Based on the tutorial, I tried to train the model using the following code:
However, I get the following error message:
Can someone help?