SheffieldML / GPy

Gaussian processes framework in python
BSD 3-Clause "New" or "Revised" License

Getting the posterior mean and covariance matrix in binary classification tasks. #969

Open tom192180 opened 2 years ago

tom192180 commented 2 years ago

Hello! I read #700, and it is possible to find the mean and covariance matrix of p(f*|X, y, x*) in regression tasks, where f* is the latent function value at the test input, x* is the test input, and X and y are the training inputs and labels.

If I understand correctly, the code for the mean and covariance matrix of p(f*|X, y, x*) is: `model.predict(test_x.reshape(-1,1), full_cov=True, include_likelihood=False)`.

The code for the mean and covariance matrix of p(y*|X, y, x*) is: `model.predict(test_x.reshape(-1,1), full_cov=True, include_likelihood=True)`, where y* is the predicted new observation.
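For the regression case, this is roughly what I mean (a minimal sketch with a made-up toy 1-D dataset and an RBF kernel, just for illustration):

```python
import numpy as np
import GPy

# toy 1-D regression data (made up for this example)
X = np.random.uniform(-3., 3., (50, 1))
y = np.sin(X) + 0.1 * np.random.randn(50, 1)

model = GPy.models.GPRegression(X, y, GPy.kern.RBF(input_dim=1))
model.optimize()

test_x = np.linspace(-3., 3., 20)

# p(f*|X, y, x*): posterior over the latent function at the test inputs
f_mean, f_cov = model.predict(test_x.reshape(-1, 1),
                              full_cov=True,
                              include_likelihood=False)

# p(y*|X, y, x*): predictive distribution over new observations
# (with a Gaussian likelihood this adds the noise variance to the covariance)
y_mean, y_cov = model.predict(test_x.reshape(-1, 1),
                              full_cov=True,
                              include_likelihood=True)
```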

I would like to ask: in binary classification, do the mean and covariance matrix of p(f*|X, y, x*) come from the same code in GPy? I ask because p(f*|X, y, x*) is further transformed into a class probability by computing ∫ sigmoid(f*) p(f*|X, y, x*) df* over f*.
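To make that transformation concrete, here is a generic per-test-point approximation of the integral with Gauss-Hermite quadrature (the function name and the numbers are hypothetical, and I use a logistic sigmoid here; GPy's Bernoulli likelihood uses a probit link by default, I believe, in which case the integral even has a closed form):

```python
import numpy as np
from scipy.special import expit  # logistic sigmoid

def class_probability(mu, var, n_points=32):
    # approximate \int sigmoid(f) N(f | mu, var) df by Gauss-Hermite quadrature
    nodes, weights = np.polynomial.hermite.hermgauss(n_points)
    f = mu + np.sqrt(2.0 * var) * nodes  # change of variables to the quadrature grid
    return weights @ expit(f) / np.sqrt(np.pi)

# hypothetical latent posterior at one test point: mean 0.8, variance 0.5
print(class_probability(0.8, 0.5))
```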

Also, I found that if the model is a classifier, `model.predict(test_x.reshape(-1,1), full_cov=True, include_likelihood=True)` does not return the mean.

Thank you in advance!

tom192180 commented 2 years ago

Hey, I tried it myself recently and found that `model.predict(test_x.reshape(-1,1), full_cov=True, include_likelihood=None)` gives the mean and covariance of the latent f(x*), so problem solved!
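For anyone who lands here later, a minimal sketch of what worked for me (the toy binary data are made up; I also pass include_likelihood=False below, which I would expect to behave the same as include_likelihood=None since the flag only seems to be truth-tested):

```python
import numpy as np
import GPy

# toy 1-D binary classification data (made up for this example)
X = np.random.uniform(-3., 3., (50, 1))
y = (np.sin(X) > 0).astype(float)

model = GPy.models.GPClassification(X, y, kernel=GPy.kern.RBF(input_dim=1))
model.optimize()

test_x = np.linspace(-3., 3., 20)

# mean and covariance of the latent posterior p(f*|X, y, x*)
f_mean, f_cov = model.predict(test_x.reshape(-1, 1),
                              full_cov=True,
                              include_likelihood=False)

# per-point class probabilities p(y*=1|X, y, x*), if I read the API correctly
p, _ = model.predict(test_x.reshape(-1, 1), include_likelihood=True)
```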