Open · oteym opened this issue 3 years ago
Is it possible to easily implement a non-zero mean function with an (improper) vague prior on the coefficients of explicit basis functions, cf. Section 2.7 in the GPML book? Specifically, predictions would be generated using formula (2.42) rather than ‘naively’, for the reasons explained there. I've dug around the package codebase quite a bit, but I'm rather lost. Or, if that isn't possible, is there some workaround?

Many thanks. (In fact, many thanks for the wonderfully useful package in general.)

You can definitely do it, but you'll need to implement the model as described in §2.7 yourself. I don't think GPy has an implementation of this model as-is, i.e. I don't see one in https://github.com/SheffieldML/GPy/tree/devel/GPy/models.

But yeah, it shouldn't be much more than implementing (2.41) for prediction and (2.44) for the likelihood function in a new class VagueLinearPriorGP(GP) (or whatever you choose to name it).

For those reading along, the "GPML book" is Rasmussen & Williams (2006), and it's free online at http://www.gaussianprocess.org/gpml/

Poking around the tutorials a bit, this one looks like it will get you a step closer: https://nbviewer.jupyter.org/github/SheffieldML/notebook/blob/master/background/BayesianLinearRegression.ipynb
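For anyone who wants a concrete starting point, here is a minimal numpy sketch of just the prediction step, i.e. the vague-prior limit (2.42) of (2.41), bolted onto an ordinary GPRegression model rather than wrapped up in a proper VagueLinearPriorGP(GP) subclass. The basis functions h(x) = [1, x] and the helper names (H_of, predict_vague_basis) are illustrative choices of mine, not anything GPy provides, and the kernel hyperparameters are optimised with the usual zero-mean marginal likelihood rather than the limiting form mentioned above, so treat it as a sketch, not a finished implementation.

```python
import numpy as np
import GPy

# Toy data: a linear trend plus a smooth wiggle and noise.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 10.0, 40)[:, None]
y = 0.7 * X + np.sin(X) + 0.1 * rng.standard_normal(X.shape)

def H_of(X):
    """Stack the explicit basis functions h(x) = [1, x] as rows (m x n), as in GPML 2.7."""
    return np.vstack([np.ones(len(X)), X[:, 0]])

# Ordinary zero-mean GP regression, used here only to get kernel/noise hyperparameters.
m = GPy.models.GPRegression(X, y, GPy.kern.RBF(1))
m.optimize()

def predict_vague_basis(m, Xstar, H_of):
    """Mean and covariance of g(x) = f(x) + h(x)^T beta with a vague prior on beta,
    i.e. GPML (2.41) in the limit B^{-1} -> 0, which is (2.42)."""
    Xtr, ytr = np.asarray(m.X), np.asarray(m.Y)
    sigma2 = float(m.likelihood.variance)
    Ky = m.kern.K(Xtr) + sigma2 * np.eye(len(Xtr))   # K(X, X) + sigma^2 I
    Ks = m.kern.K(Xtr, Xstar)                        # K(X, X*)
    Kss = m.kern.K(Xstar)                            # K(X*, X*)

    L = np.linalg.cholesky(Ky)
    def Ky_inv(B):
        # Solve Ky @ Z = B via the Cholesky factor.
        return np.linalg.solve(L.T, np.linalg.solve(L, B))

    # Standard zero-mean GP predictive mean and covariance.
    fmean = Ks.T @ Ky_inv(ytr)
    fcov = Kss - Ks.T @ Ky_inv(Ks)

    # Vague-prior correction for the explicit basis functions (GPML 2.42).
    H, Hs = H_of(Xtr), H_of(Xstar)
    A = H @ Ky_inv(H.T)                              # H Ky^{-1} H^T
    beta = np.linalg.solve(A, H @ Ky_inv(ytr))       # posterior mean of beta
    R = Hs - H @ Ky_inv(Ks)
    gmean = fmean + R.T @ beta
    gcov = fcov + R.T @ np.linalg.solve(A, R)
    return gmean, gcov, beta

Xstar = np.linspace(-2.0, 12.0, 100)[:, None]
gmean, gcov, beta = predict_vague_basis(m, Xstar, H_of)
print("Estimated trend coefficients (intercept, slope):", beta.ravel())
```

The same linear algebra is what a VagueLinearPriorGP(GP) subclass would do inside its own predict method; the remaining piece, which this sketch skips, is hooking the limiting marginal likelihood into the class so that the hyperparameters are optimised under the basis-function model rather than the zero-mean one.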