jacobnzw / SSMToybox

Nonlinear Sigma-Point Kalman Filters based on Bayesian Quadrature
MIT License

BQTransform: Computation of kernel expectations #13

Closed jacobnzw closed 5 years ago

jacobnzw commented 5 years ago

StudentTProcessModel.exp_model_variance() needs to be recomputed at every time step because it contains an observation-dependent scaling factor. Currently the routine delegates to GaussianProcessModel.exp_model_variance(). While this makes sense theoretically, it is unfortunate, because it is only practical when the model uses the RBF-Gauss pair, whose kernel expectations are cheap. With RBF-Student, however, we need a large number of MC samples to get good approximations of the RBF-Student expectations. Calling this at every time step is insane!
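To illustrate the cost, here is a minimal sketch of what an MC approximation of an RBF-Student kernel expectation looks like; the function name, the `kernel(X, Y)` callable, and the use of a Gaussian/chi-square mixture to sample the Student-t density are illustrative assumptions, not the SSMToybox API:

```python
import numpy as np

def mc_kernel_expectation(kernel, unit_sp, num_samples=10000, dof=4.0, seed=0):
    """MC estimate of q_i = E_x[k(x, s_i)] under a standard Student-t density.

    Hypothetical sketch: `kernel(X, Y)` returns the (len(X), len(Y)) kernel
    matrix; `unit_sp` holds the unit sigma-points as rows, shape (m, d).
    """
    rng = np.random.default_rng(seed)
    m, d = unit_sp.shape
    # Standard multivariate Student-t via the Gaussian / chi-square mixture.
    g = rng.standard_normal((num_samples, d))
    u = rng.chisquare(dof, size=(num_samples, 1))
    samples = g * np.sqrt(dof / u)
    # Average the kernel evaluations over all samples: shape (m,).
    return kernel(samples, unit_sp).mean(axis=0)
```

Every call draws and evaluates `num_samples` kernel rows, which is exactly why doing this once per time step (rather than once up front) is so wasteful.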

This is a symptom of a deeper problem. More broadly, all the necessary kernel expectations should be pre-computed and stored as member variables alongside the weights (as is already done in bq_weights()), so that integral_variance() and exp_model_variance() can simply access them. This is possible because the expectations depend only on the unit sigma-points.
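A minimal sketch of the pre-computation idea, using the closed-form RBF-Gauss expectations for a unit-output-scale RBF kernel with lengthscale `ell` (the class and method names are illustrative, not the actual SSMToybox API):

```python
import numpy as np

class BQTransformSketch:
    """Sketch: compute kernel expectations once, in the constructor,
    and let the per-step variance routines merely reuse them.
    """

    def __init__(self, unit_sp, ell=1.0):
        self.unit_sp = np.atleast_2d(unit_sp)  # (n, d) unit sigma-points
        self.ell = ell
        # Pre-computed once: these depend only on the unit sigma-points.
        self.q = self._kernel_mean(self.unit_sp)      # q_i = E_x[k(x, s_i)]
        self.K = self._kernel_matrix(self.unit_sp)    # K_ij = k(s_i, s_j)
        self.weights = np.linalg.solve(self.K, self.q)

    def _kernel_matrix(self, sp):
        d2 = ((sp[:, None, :] - sp[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / self.ell ** 2)

    def _kernel_mean(self, sp):
        # Closed form for x ~ N(0, I) and an RBF kernel with lengthscale ell.
        n, d = sp.shape
        c = (self.ell ** 2 / (self.ell ** 2 + 1)) ** (d / 2)
        return c * np.exp(-0.5 * (sp ** 2).sum(-1) / (self.ell ** 2 + 1))

    def integral_variance(self, scale=1.0):
        # Cheap per-step call: only the observation-dependent `scale` varies;
        # everything else was cached in __init__.
        d = self.unit_sp.shape[1]
        kbar = (self.ell ** 2 / (self.ell ** 2 + 2)) ** (d / 2)  # E[E[k(x, x')]]
        return scale * (kbar - self.q @ self.weights)
```

With the expensive quantities cached, the Student-t scaling factor can multiply a stored value at each step instead of triggering a fresh computation.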

Re-computation of the EMV was implemented mainly because, at one point, I experimented with a Bayesian online kernel-parameter estimation technique in MarginalInference. It might still be useful for estimating the parameters of the Bayes-Sard transform.

💡 Use np.random.seed(0) for repeatable results.
💡 The RBF-Student expectations could be jitted.
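A quick sketch of the seeding tip: fixing the seed before drawing the Student-t samples makes the MC-approximated expectations identical across runs (the seed value and sample shape are arbitrary):

```python
import numpy as np

# Same seed -> identical Student-t draws -> repeatable MC expectations.
np.random.seed(0)
a = np.random.standard_t(4.0, size=1000)
np.random.seed(0)
b = np.random.standard_t(4.0, size=1000)
assert np.array_equal(a, b)
```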