Closed — evanmunro closed this issue 4 years ago
Hi Evan,
You're quite right, there are some missing pieces here. I implemented those functions but didn't take the time to think about the user API, and wanted to avoid bloating the package prematurely. You've got the right idea: you would just need to create a differentiable function for Optim, following the template of optimize.jl.
I've just created an example notebook. In the last section I copy-pasted the relevant code from optimize.jl and substituted in the cross-validation functions. This is very brute-force, and obviously not something we would want to add to the package as is.
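For anyone landing here later, a minimal sketch of the approach described above: wrap the (unexported) LOO functions in an `fg!` objective for Optim.jl, following the pattern of optimize.jl. Note this is a sketch, not the package's API — the exact signatures of `logp_LOO`, `dlogpdθ_LOO`, `set_params!`, and `update_target!` are assumptions here; check crossvalidation.jl and optimize.jl for the real ones.

```julia
using GaussianProcesses, Optim

# Toy data and GP; kernel/mean choices are placeholders.
x = 2π .* rand(20)
y = sin.(x) .+ 0.1 .* randn(20)
gp = GP(x, y, MeanZero(), SE(0.0, 0.0))

# fg! computes the negative LOO log-probability and its gradient in place,
# so Optim can minimize it (i.e. maximize the LOO criterion).
function fg!(F, G, θ)
    GaussianProcesses.set_params!(gp, θ)       # assumed setter, as in optimize.jl
    GaussianProcesses.update_target!(gp)       # refresh cached factorizations
    if G !== nothing
        G .= -GaussianProcesses.dlogpdθ_LOO(gp)  # assumed signature
    end
    if F !== nothing
        return -GaussianProcesses.logp_LOO(gp)   # assumed signature
    end
    return nothing
end

θ0 = GaussianProcesses.get_params(gp)          # current hyperparameters as the start point
res = optimize(Optim.only_fg!(fg!), θ0, LBFGS())
```

`Optim.only_fg!` is the standard Optim.jl entry point when the objective and gradient share expensive intermediate computations, which is the case here since both LOO quantities reuse the same kernel-matrix factorization.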
Sorry for taking so long to respond, I hope this wasn't a blocker for you.
Hi,
I'm interested in optimizing the kernel hyperparameters with respect to the LOO log-probability criterion. I was looking at the cross-validation functions in the documentation, and it seems I should be able to use GaussianProcesses.logp_LOO and GaussianProcesses.dlogpdθ_LOO in combination with Optim.jl to do this optimization. However, it doesn't look like these functions are exported from the package, which suggests that there is an option somewhere else that would allow me to fit my GP model based on the functions implemented in crossvalidation.jl. Could you point me in the right direction? Is there an example written somewhere that shows how to use the nice functionality implemented in crossvalidation.jl?

Thanks very much!