jsjol / diGP

BSD 3-Clause "New" or "Revised" License

Check how GPy handles multi-task learning and prediction. #13

Closed jsjol closed 7 years ago

jsjol commented 7 years ago

cf: http://nbviewer.jupyter.org/github/SheffieldML/notebook/blob/master/GPy/coregionalized_regression_tutorial.ipynb

Also look at: GPy.models.gp_kronecker_gaussian_regression.py

jsjol commented 7 years ago

I think coregionalized regression is not what we want to do. Instead it seems much simpler to create a kernel using the `active_dims` argument. Example:

```python
k1 = GPy.kern.Linear(input_dim=1, active_dims=[0])  # works on the first column of X, index=0
k2 = GPy.kern.ExpQuad(input_dim=1, lengthscale=3, active_dims=[1])  # works on the second column of X, index=1
k = k1 * k2
```
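To make the semantics concrete, here is a minimal numpy sketch (not GPy code) of what this product kernel evaluates on a two-column `X`, using the standard linear and squared-exponential formulas with variances fixed at 1 — an illustrative simplification:

```python
import numpy as np

def product_kernel(X, X2=None, lengthscale=3.0):
    """Linear kernel on column 0 times squared-exponential on column 1.

    Mirrors k = k1 * k2 with active_dims=[0] and [1]; variance
    parameters are fixed at 1 (an illustrative assumption).
    """
    if X2 is None:
        X2 = X
    # Linear kernel on the first column: k1(x, x') = x0 * x0'
    K1 = np.outer(X[:, 0], X2[:, 0])
    # Squared-exponential on the second column:
    # k2(x, x') = exp(-(x1 - x1')^2 / (2 * lengthscale^2))
    d = X[:, 1][:, None] - X2[:, 1][None, :]
    K2 = np.exp(-0.5 * d**2 / lengthscale**2)
    # A product kernel multiplies the Gram matrices elementwise.
    return K1 * K2

X = np.array([[1.0, 0.0],
              [2.0, 3.0]])
K = product_kernel(X)
```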

jsjol commented 7 years ago

Computationally, it would be very convenient if the spatial covariance factorizes across coordinates, as e.g. the RBF kernel does. Then the covariance matrix can be written as a Kronecker product of the per-coordinate covariance matrices. See Wilson (2014), p. 63, for the theory. Check whether the code in GPy/kern/src/grid_kerns.py is applicable.
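A quick numerical check of the factorization claim (standard RBF algebra, not diGP or GPy code): on a Cartesian grid, the 2-D RBF Gram matrix equals the Kronecker product of the two 1-D RBF Gram matrices, provided the grid points are ordered with the first coordinate varying slowest:

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """1-D squared-exponential Gram matrix between point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * d**2 / ell**2)

x = np.linspace(0.0, 1.0, 4)   # grid coordinates along the first axis
y = np.linspace(-1.0, 2.0, 3)  # grid coordinates along the second axis

# Full grid, ordered so that x varies slowest (row-major / C order).
grid = np.array([[xi, yj] for xi in x for yj in y])

# 2-D RBF on the grid: exp(-||p - q||^2 / 2) factorizes per coordinate,
# because the squared distance is a sum over coordinates.
D2 = ((grid[:, None, :] - grid[None, :, :])**2).sum(-1)
K_full = np.exp(-0.5 * D2)

# Kronecker factorization of the same matrix.
K_kron = np.kron(rbf(x, x), rbf(y, y))
```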

jsjol commented 7 years ago

The method of Stegle et al. (2011), implemented in GPy.models.gp_kronecker_gaussian_regression.py, is the way to go. The main computation is a symmetric eigenvalue decomposition of the spatial covariance and of the feature covariance, treated separately.
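The computational core can be sketched in plain numpy (a generic illustration of the eigendecomposition trick, not the GPy source): with K = K_spatial ⊗ K_feature, solving (K + σ²I)⁻¹ vec(Y) needs only the two small eigendecompositions, never the full Kronecker product:

```python
import numpy as np

def kron_solve(A, B, Y, noise_var):
    """Solve (kron(A, B) + noise_var * I) x = vec(Y) without forming kron(A, B).

    A (n x n) and B (m x m) are symmetric PSD covariances, Y is n x m,
    and vec() is row-major flattening. Since A = Va diag(la) Va^T and
    B = Vb diag(lb) Vb^T, the big matrix is diagonalized by kron(Va, Vb).
    """
    la, Va = np.linalg.eigh(A)
    lb, Vb = np.linalg.eigh(B)
    # Rotate into the joint eigenbasis: kron(Va, Vb)^T vec(Y) = vec(Va^T Y Vb)
    Ytil = Va.T @ Y @ Vb
    # Joint eigenvalues are all products la_i * lb_j, shifted by the noise.
    D = np.outer(la, lb) + noise_var
    # Scale elementwise, then rotate back.
    return (Va @ (Ytil / D) @ Vb.T).ravel()

rng = np.random.default_rng(0)
n, m = 5, 4
A = rng.standard_normal((n, n)); A = A @ A.T + n * np.eye(n)
B = rng.standard_normal((m, m)); B = B @ B.T + m * np.eye(m)
Y = rng.standard_normal((n, m))

x_fast = kron_solve(A, B, Y, noise_var=0.1)
```

This is O(n³ + m³) plus matrix products, versus O(n³m³) for the direct solve.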

It will work out of the box for an arbitrary spatial covariance matrix. Very minor adjustments are required if you want to leverage special structure (which is of special interest for the spatial covariance). For example,

jsjol commented 7 years ago

Now I think that to utilize the Kronecker structure, it would be more appropriate to use one of GPy's methods for gridded regression.

jsjol commented 7 years ago

I believe this is resolved by my extensions to GPy.