cornellius-gp / gpytorch

A highly efficient implementation of Gaussian Processes in PyTorch
MIT License

[Docs] Sampling function from posterior for Thompson sampling? #757

Closed · wangell closed this issue 5 years ago

wangell commented 5 years ago

📚 Documentation/Examples

I'm trying to implement Thompson sampling - is there an implemented way to sample a function from the model posterior without an input? i.e. f = model.sample(); f(X)

Similar to this https://math.stackexchange.com/questions/1218718/how-do-we-sample-from-a-gaussian-process

KeAWang commented 5 years ago

You can't draw samples of random functions from the posterior unless you specify the points at which you want to evaluate those random functions.

To sample from a GP's posterior evaluated at points `test_x`, do

```python
# Set into posterior mode
model.eval()
likelihood.eval()

preds = likelihood(model(test_x))
samples = preds.sample()
```

where `test_x` is `N x D` if you're not using batched GPs. `preds` is a multivariate Gaussian distribution corresponding to your GP posterior at those points.

jacobrgardner commented 5 years ago

Very small addendum to @KeAWang's answer: if you need to backpropagate through the samples for any reason, be sure to use rsample instead of sample.
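A minimal illustration of that difference, using `torch.distributions` directly (gpytorch's `MultivariateNormal` follows the same `sample`/`rsample` contract):

```python
import torch

loc = torch.zeros(3, requires_grad=True)
dist = torch.distributions.MultivariateNormal(loc, torch.eye(3))

x = dist.rsample()   # reparameterized draw: differentiable w.r.t. loc
x.sum().backward()
print(loc.grad)      # tensor([1., 1., 1.])

y = dist.sample()    # .sample() runs under no_grad: y is detached
```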

eytan commented 5 years ago

Are you trying to implement TS for bandits or BayesOpt? If it's for BayesOpt, we have some code lying around internally that we could add to BoTorch if you'd like, @wangell.

jacobrgardner commented 5 years ago

I'm going to close this unless someone mentions they are still having trouble with this.

wangell commented 5 years ago

@eytan That would be awesome

DavidWalz commented 4 years ago

@eytan I'd be very interested in your TS approach for BayesOpt. I only know of the approximate spectral sampling approach for stationary kernels.
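For reference, the approximate spectral-sampling approach mentioned here can be sketched with random Fourier features plus Bayesian linear regression, in plain NumPy. Every name and hyperparameter below is illustrative, and this is not a gpytorch API:

```python
import numpy as np


def rff_thompson_sample(train_x, train_y, m=200, lengthscale=0.25,
                        outputscale=1.0, noise=1e-2, rng=None):
    """Draw one approximate posterior function sample for an RBF-kernel GP
    via random Fourier features (hyperparameters here are illustrative)."""
    rng = np.random.default_rng(rng)
    d = train_x.shape[1]
    # Frequencies drawn from the RBF kernel's spectral density
    W = rng.standard_normal((m, d)) / lengthscale
    b = rng.uniform(0.0, 2.0 * np.pi, size=m)

    def phi(x):  # feature map: phi(x) @ phi(x').T approximates k(x, x')
        return np.sqrt(2.0 * outputscale / m) * np.cos(x @ W.T + b)

    Phi = phi(train_x)                          # (n, m) feature matrix
    A = Phi.T @ Phi + noise * np.eye(m)
    mean = np.linalg.solve(A, Phi.T @ train_y)  # posterior mean of weights
    cov = noise * np.linalg.inv(A)              # posterior covariance of weights
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(m))
    theta = mean + L @ rng.standard_normal(m)   # one weight sample
    return lambda x: phi(x) @ theta             # an analytic function sample


# Usage: draw one sample and maximize it over a grid (one Thompson-sampling step)
train_x = np.linspace(0, 1, 20)[:, None]
train_y = np.sin(6.0 * train_x[:, 0])
f = rff_thompson_sample(train_x, train_y, rng=0)
grid = np.linspace(0, 1, 200)[:, None]
next_x = grid[f(grid).argmax()]
```

Because the sample is an explicit function of `x`, it can be evaluated (or optimized) anywhere in the domain, unlike a joint draw at fixed test points; the trade-off is that it only applies to stationary kernels.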

Balandat commented 4 years ago

@DavidWalz Spectral sampling would work; what we've mostly been doing is a discrete version based on drawing joint samples on a (large) discretization of the domain. That works pretty well in reasonably small dimensions and is very fast, given how gpytorch exploits fast predictive variances and batched computation. Here is an old PR for this that I hope to clean up some time soon: https://github.com/pytorch/botorch/pull/218/