SheffieldML / GPy

Gaussian processes framework in python
BSD 3-Clause "New" or "Revised" License

Importing GP-mat code to GPy #835

Open danivela opened 4 years ago

danivela commented 4 years ago

Hi,

I am currently working on a project at my home university, porting a Matlab script that uses GPmat to a Python script that uses GPy. I am having trouble understanding this step:

optionsGP = gpOptions('ftc');
optionsGP.kern = {'nse','log','bias'};
optionsGP.posRef = postx_est;
modelGP = gpCreate(size(Xtrain,2), size(ytrain,2), Xtrain, ytrain-mtrain, optionsGP);
modelGP.beta = 1./transpose((sigma_u+noise_sigmau)^2./dtxNoise(1:Mon).^2 + (varnoise+noisevarnoise)*ones(Mon,1));
display = 0;
modelGP = gpOptimise(modelGP, display, 1000);

I do not know how the kernel is created or what the beta parameter means, but I need to implement the same kernel and model in GPy. I have spent a huge amount of time trying to get past this, and I would be very grateful for any help.

adamian commented 4 years ago

Hi,

I haven't used GPmat in a while so I might have forgotten, but I don't recognize the 'nse' kernel option (line 2) or the optionsGP.posRef option. If these are custom extensions of GPmat, you might need to contact the author of that code.

In any case, here are some comments on the code you included above, in case they help:

% Create a GP with "Full Training Conditional", i.e. no approximations. In GPy, this means use the standard GPRegression, not any inducing point or other approximations. 
optionsGP = gpOptions('ftc'); 
% Your kernel is a sum of three kernels: 'nse', 'log', and 'bias' (I don't know what the 'nse' kernel is). In GPy, you can achieve the same by adding kernels with the overloaded '+' operator. 
optionsGP.kern = {'nse','log','bias'};
optionsGP.posRef=postx_est;  
% Create GP with the options above, where ytrain-mtrain presumably zero-means the data 
modelGP = gpCreate(size(Xtrain,2), size(ytrain,2), Xtrain, ytrain-mtrain, optionsGP);  
% modelGP.beta, if I remember correctly, is the inverse of what GPy calls model.Gaussian_noise.variance. I'm not sure whether this has any effect in GPmat when combined with the 'ftc' (full) GP.  
modelGP.beta=1./transpose((sigma_u+noise_sigmau)^2./dtxNoise(1:Mon).^2+(varnoise+noise_varnoise)*ones(Mon,1));
display=0; 
% Optimize the model for 1000 iterations 
modelGP = gpOptimise(modelGP, display, 1000);
danivela commented 4 years ago

Thank you so much for your answer. It was really useful for my project.