lion241 opened this issue 2 years ago
Hi! I am new to this library and have been checking how to use a spatio-temporal GP. I did not find a clear example, but I guess ScaleKernel is the one to use to include the spatial dimension. I essentially want to predict t+1 from t; the output variable at t+1 is y_{t+1} = g(y_t, x_{t,1}, x_{t,2}, x_{t,3}, x_{t,4}). As I understand it, ARD implies the off-diagonal terms are zero (as in an ARD kernel), rather than meaning automatic relevance determination.
Can you explain this further?
I can see that I can define a prior on the hyperparameters, but how can I do a posterior update? Is there anything built in currently for that purpose?
See https://docs.gpytorch.ai/en/stable/examples/01_Exact_GPs/GP_Regression_Fully_Bayesian.html.
Finally, do you have something along the lines of kernel search like in this paper: https://arxiv.org/pdf/1302.4922.pdf
No, not currently.
Thank you, I will take a look at the fully Bayesian notebook in further detail!
The problem I have can be seen as follows: I have one output variable y that is a high-variance time series (at least I get high variance with ExactGP after playing around with the kernels). I also have 6 other variables that are less noisy time series (there is only weak correlation between the output y and each of the 6 variables, but that is understandable given that y has high variance).
So I thought I would start by predicting y_t with an ExactGP using historical values only; based on cross-validation, its final accuracy has high variance.
The next step is to combine the 6 other variables with the historical data of y to make future predictions — a spatio-temporal problem. I could model this as spatio-temporal, as multitask, or as some combination of the two. So in my ExactGP model I want to use a vector of 6+1 values (the 6 variables and the previous value y_t) to predict y_{t+1}.
After that, I will look into using your Bayesian notebook to start doing averaging across models, etc. The API looks a bit confusing with the use of NUTS.
This part of the API makes it hard for me to picture the sampling process at the hyperparameter level (where are the MCMC parameters for the proposal distribution, and can I simply swap in other sampling methods?):
```python
def pyro_model(x, y):
    with gpytorch.settings.fast_computations(False, False, False):
        sampled_model = model.pyro_sample_from_prior()
        output = sampled_model.likelihood(sampled_model(x))
        pyro.sample("obs", output, obs=y)
    return y

nuts_kernel = NUTS(pyro_model)
mcmc_run = MCMC(nuts_kernel, num_samples=num_samples, warmup_steps=warmup_steps, disable_progbar=smoke_test)
mcmc_run.run(train_x, train_y)
```
I hope that gives more than enough context. It would be nice if you could flesh out that example a bit more, or add one more Bayesian example to make it easier for first-time users.
Many thanks!!