ngphuoc opened 5 years ago
Currently no. We only use derivatives with respect to the parameters for parameter inference. We have been developing some new auto-differentiation tools which will make it easier to differentiate the kernels. Once we have this new functionality, it should be possible to consider derivative observations. I'll add this to the milestones list.
Right, at the moment this isn't implemented, but it certainly would be useful. The main requirement would be to implement the derivative of the covariance function (just before equation 4 in the paper you cited). I think @jbrea has done this using ForwardDiff in his BayesianOptimization.jl package, which should work in your case. I think it would be good to have this implemented in GaussianProcesses.jl. Once that's done, getting predictions of the derivative process would be fairly straightforward, though including derivative observations (as in the paper) requires a bit more architectural thought.
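For what it's worth, here is a minimal sketch of what differentiating a kernel with ForwardDiff looks like. The squared-exponential kernel and the names below (`se_kernel`, `dk_dx`, `d2k_dxdy`) are illustrative only, not GaussianProcesses.jl internals:

```julia
using ForwardDiff

# Squared-exponential kernel with unit hyperparameters (illustrative, 1-D)
se_kernel(x, y) = exp(-(x - y)^2 / 2)

# ∂k/∂x at (x, y): the cross-covariance between f'(x) and f(y)
dk_dx(x, y) = ForwardDiff.derivative(x -> se_kernel(x, y), x)

# ∂²k/∂x∂y at (x, y): the covariance between f'(x) and f'(y)
d2k_dxdy(x, y) = ForwardDiff.derivative(y -> dk_dx(x, y), y)
```

These two derivatives are exactly the blocks needed for the joint covariance of function values and derivative observations in the Solak et al. formulation.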
@ngphuoc, I don't think this is going to be at the top of our to-do lists for a while. Would you be able to contribute a PR? I would be happy to help you figure out what's needed.
Ah yes, I've seen @jbrea's use of ForwardDiff. I'll use ForwardDiff for getting the gradient for now.
I wonder if there is an interface for predicting the derivative at an input x, something like:
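No such interface exists in GaussianProcesses.jl yet, but here is a rough sketch of what one could look like, assuming a 1-D squared-exponential kernel; `predict_deriv` is a hypothetical name, not package API. The posterior mean of the derivative process is ∂/∂x* k(x*, X) K⁻¹ y, which can be obtained by differentiating the predictive mean with ForwardDiff:

```julia
using ForwardDiff
using LinearAlgebra

# SE kernel with unit hyperparameters (illustrative)
k(x, y) = exp(-(x - y)^2 / 2)

# Hypothetical interface: posterior mean of f'(xstar) given data (X, y).
# σn² is a small noise/jitter term on the diagonal for numerical stability.
function predict_deriv(X, y, xstar; σn² = 1e-6)
    K = [k(xi, xj) for xi in X, xj in X] + σn² * I
    α = K \ y
    # Predictive mean m(x) = Σᵢ k(x, xᵢ) αᵢ; its derivative is the
    # posterior mean of the derivative process.
    m(x) = sum(k(x, xi) * αi for (xi, αi) in zip(X, α))
    return ForwardDiff.derivative(m, xstar)
end
```

Predictive variances for the derivative would need `d2k_dxdy`-type terms as well, so a proper implementation belongs inside the package rather than user code.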
Reference: Solak, Ercan, et al. "Derivative observations in Gaussian process models of dynamic systems." Advances in neural information processing systems. 2003.