tlienart closed this issue 5 years ago.
Sorry for the slow response, and thanks for raising the issue regarding ScikitLearn.jl. When we first incorporated the scikit-learn interface a few years ago we only included some basic functionality, and over time we haven't really maintained it; a failing on our part. I'm not familiar with predict_proba, so I couldn't say for sure.
Thanks, the question arises from trying to develop a kind of MLR in Julia (https://github.com/alan-turing-institute/MLJ.jl), where we're currently writing a bunch of interfaces to existing Julia packages.
In the case of Gaussian processes, for instance, it seems the package can be used for classification (if I'm not mistaken, as per one of the notebooks), and in that case one may want to predict a probability for each class rather than just the most likely class. Do you know whether I could do this with the package?
Thanks!
MLJ.jl looks like an interesting project. Please do let us know if you have any suggestions for how we can make GaussianProcesses.jl interface better with other packages.
With regards to your question, yes, you can extract the probability. In this example, you have a Bernoulli likelihood where the probability of success is the probit-transformed latent GP function (see https://github.com/STOR-i/GaussianProcesses.jl/blob/master/src/likelihoods/bernoulli.jl). As the mean of a Bernoulli distribution is the success probability, you could use the function mean_lik, or simply extract the latent GP function and apply your own probit/logit transformation. If you think that this could be a useful feature, we could modify the interface so that extracting the probability is more user-friendly.
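For concreteness, here is a minimal sketch of the second option (extracting the latent function and applying the probit transform yourself), assuming the Bernoulli/probit setup from the Classification notebook. The toy data, kernel settings, and the use of optimize! for a quick MAP fit are illustrative choices, not the package's recommended workflow (the notebook samples the latent function with mcmc instead).

```julia
using GaussianProcesses, Distributions, Random

Random.seed!(1)
X = rand(1, 20)                           # 20 one-dimensional inputs (features as rows)
y = Vector{Bool}(X[1, :] .> 0.5)          # Bool labels, as in the Classification notebook

# Binary GP classifier: zero mean, squared-exponential kernel, Bernoulli (probit) likelihood
gp = GP(X, y, MeanZero(), SE(0.0, 0.0), BernoulliLik())
optimize!(gp)                             # illustrative MAP fit of latent function and hyperparameters

# Latent (pre-link) mean and variance at test inputs
Xtest = reshape(collect(range(0.0, stop=1.0, length=10)), 1, :)
μ, σ² = predict_f(gp, Xtest)

# Plug-in probit transform of the latent mean...
p_plugin = cdf.(Normal(), μ)
# ...or integrate the latent Gaussian through the probit link (closed form for probit):
p = cdf.(Normal(), μ ./ sqrt.(1 .+ σ²))
```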
At the moment, we've only implemented binary classification for GPs; however, we are working to extend this to multi-class classification.
Thanks, that's great, I'll try using mean_lik.
> If you think that this could be a useful feature, we could modify the interface so that extracting the probability is more user-friendly.
I would think that would be great, yes, especially for people who may be less familiar with GPs. Either that, or a short example in the current Classification notebook.
> At the moment, we've only implemented binary classification for GPs; however, we are working to extend this to multi-class classification.
Thanks for taking the time to answer; I think that closes the issue for now (feel free to re-open if you think it might serve as a good "todo" item).
Hello, I noticed that the ScikitLearn.jl interface only implements predict via predict_y of a GPE. However, it seems to me that Gaussian processes should naturally correspond to a predict_proba, since they're a probabilistic method (?). For a classifier, in the Classification notebook, you use a GPMC; could this be used to correspond to the predict_proba of ScikitLearn? Thanks!
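To make the connection to the original question concrete, here is a hypothetical sketch of what a predict_proba-style method for a binary GPMC classifier could look like. It is not part of the package's ScikitLearn interface; the integration of the latent Gaussian through the probit link is just one possible choice, and the column layout follows the scikit-learn convention of one column per class.

```julia
import ScikitLearnBase
using GaussianProcesses, Distributions

# Hypothetical predict_proba for a binary GPMC (not provided by the package's
# ScikitLearn interface at the time of writing).
function ScikitLearnBase.predict_proba(gp::GPMC, X::AbstractMatrix)
    μ, σ² = predict_f(gp, X)                    # latent mean/variance at the query points
    p1 = cdf.(Normal(), μ ./ sqrt.(1 .+ σ²))    # P(class = 1) under a probit link
    hcat(1 .- p1, p1)                           # one column per class, [P(0) P(1)]
end
```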