Closed: wangell closed this issue 5 years ago.
You can't draw samples of random functions from the posterior unless you specify the points at which you want to evaluate those functions. To sample from a GP's posterior evaluated at points `test_x`, do

```python
# Set into posterior mode
model.eval()
likelihood.eval()

preds = likelihood(model(test_x))
preds.sample()
```

where `test_x` is `N x D` if you're not using batched GPs. `preds` is a multivariate Gaussian distribution corresponding to your GP posterior at those points.
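For intuition, the math behind `preds.sample()` can be sketched in plain NumPy. This is an illustrative exact-GP computation, not gpytorch's actual implementation; the RBF kernel, lengthscale, noise level, and toy data below are all assumptions made for the example:

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0):
    # Squared-exponential kernel: k(a, b) = exp(-||a - b||^2 / (2 * l^2))
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

def posterior_sample(train_x, train_y, test_x, noise=0.1, n_samples=3, seed=None):
    """Draw joint samples from the exact GP posterior at test_x."""
    rng = np.random.default_rng(seed)
    K = rbf_kernel(train_x, train_x) + noise * np.eye(len(train_x))
    Ks = rbf_kernel(train_x, test_x)          # cross-covariance, shape (N_train, N_test)
    Kss = rbf_kernel(test_x, test_x)          # test covariance, shape (N_test, N_test)
    mean = Ks.T @ np.linalg.solve(K, train_y)
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    # Jitter keeps the Cholesky factorization numerically stable
    L = np.linalg.cholesky(cov + 1e-6 * np.eye(len(test_x)))
    eps = rng.standard_normal((len(test_x), n_samples))
    return mean[:, None] + L @ eps            # shape (N_test, n_samples)

# Toy 1-D data just to exercise the function
train_x = np.array([[0.0], [0.5], [1.0]])
train_y = np.array([0.0, 0.5, 1.0])
test_x = np.linspace(0, 1, 20)[:, None]
samples = posterior_sample(train_x, train_y, test_x, seed=0)
```

Each column of `samples` is one draw of the random function, jointly sampled at all 20 test points.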
Very small addendum to @KeAWang's answer: if you need to backpropagate through the samples for any reason, be sure to use `rsample` instead of `sample`.
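The reason `rsample` supports backprop is the reparameterization trick: a Gaussian sample is rewritten as a deterministic, differentiable function of the distribution's parameters plus parameter-free noise. Here is a small NumPy sketch of that idea with a hand-computed chain rule (the objective `x**2` and the parameter values are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 0.7
n = 200_000

# Reparameterized sample: x = mu + sigma * eps with eps ~ N(0, 1).
# x is now a differentiable function of mu, so gradients can flow through it.
eps = rng.standard_normal(n)
x = mu + sigma * eps

# Monte Carlo gradient of E[x^2] w.r.t. mu via the chain rule:
# d/dmu (mu + sigma * eps)^2 = 2 * (mu + sigma * eps)
grad_estimate = np.mean(2.0 * x)

# Analytic check: E[x^2] = mu^2 + sigma^2, so dE/dmu = 2 * mu = 3.0
```

Drawing with `sample` instead corresponds to treating `x` as a constant, which severs the dependence on `mu` and gives zero gradient.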
Are you trying to implement TS for bandits or BayesOpt? If it's for BayesOpt we have some code lying around internally which we can add to BoTorch if you'd like, @wangell
I'm going to close this unless someone mentions they are still having trouble with this.
@eytan That would be awesome
@eytan I'd be very interested in your TS approach for BayesOpt. I only know of the approximate spectral sampling approach for stationary kernels.
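(For readers unfamiliar with the spectral approach mentioned here: a stationary kernel can be approximated with random Fourier features, so a function draw becomes an explicit finite sum of cosines that can be evaluated anywhere. A minimal NumPy sketch for an RBF kernel follows; it draws from the approximate prior only, and the feature count and lengthscale are arbitrary choices. Conditioning on data would additionally require fitting the weights by Bayesian linear regression.)

```python
import numpy as np

rng = np.random.default_rng(0)
d, M, lengthscale = 1, 2000, 1.0

# The spectral density of the RBF kernel is Gaussian: omega ~ N(0, 1/l^2)
W = rng.standard_normal((M, d)) / lengthscale
b = rng.uniform(0.0, 2.0 * np.pi, size=M)

def features(x):
    # phi(x) such that phi(x) . phi(x') -> k(x, x') as M grows (Rahimi & Recht)
    return np.sqrt(2.0 / M) * np.cos(x @ W.T + b)

# A single Gaussian weight draw yields one explicit function f that can be
# queried at arbitrary inputs, which is exactly what Thompson sampling needs.
w = rng.standard_normal(M)
f = lambda x: features(x) @ w

# Sanity check: feature inner product approximates the true RBF kernel
x1, x2 = np.array([[0.0]]), np.array([[0.8]])
k_true = np.exp(-0.8 ** 2 / (2 * lengthscale ** 2))
k_approx = (features(x1) @ features(x2).T).item()
```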
@DavidWalz Spectral sampling would work, what we've been doing mostly is just a discrete version that's based on drawing joint samples on a (large) discretization of the domain. That works pretty well in reasonably small dimensions and is very fast given how gpytorch exploits fast predictive variances & batched computation. Here is an old PR for this that I hope to clean up some time soon: https://github.com/pytorch/botorch/pull/218/
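The discrete version described above can be sketched as: discretize the domain, draw one joint posterior sample over the whole grid, and take its argmax as the next query point. A self-contained NumPy illustration follows; the kernel, noise level, and toy objective are assumptions for the example, not the BoTorch code in the linked PR:

```python
import numpy as np

def rbf(a, b, ls=0.2):
    # 1-D squared-exponential kernel on flat arrays
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-d2 / (2 * ls ** 2))

def thompson_step(train_x, train_y, grid, noise=1e-4, seed=None):
    """One Thompson-sampling step on a discretized 1-D domain."""
    rng = np.random.default_rng(seed)
    K = rbf(train_x, train_x) + noise * np.eye(len(train_x))
    Ks = rbf(train_x, grid)
    mean = Ks.T @ np.linalg.solve(K, train_y)
    cov = rbf(grid, grid) - Ks.T @ np.linalg.solve(K, Ks)
    # One joint sample over the whole grid is one draw of the random function
    L = np.linalg.cholesky(cov + 1e-6 * np.eye(len(grid)))
    sample = mean + L @ rng.standard_normal(len(grid))
    return grid[np.argmax(sample)]  # next point to evaluate

train_x = np.array([0.1, 0.5, 0.9])
train_y = np.sin(2 * np.pi * train_x)  # toy objective observations
grid = np.linspace(0.0, 1.0, 500)
candidate = thompson_step(train_x, train_y, grid, seed=0)
```

Because the sample is drawn jointly over the grid (not point by point), the argmax respects the correlations of the posterior, which is what makes this a valid Thompson-sampling draw on the discretized domain.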
📚 Documentation/Examples
I'm trying to implement Thompson sampling - is there an implemented way to sample a function from the model posterior without specifying an input? i.e. `f = model.sample(); f(X)`
Similar to this: https://math.stackexchange.com/questions/1218718/how-do-we-sample-from-a-gaussian-process