Closed: ElOceanografo closed this issue 3 years ago.
Thanks for opening this.

> Any thoughts on the interface or structure of the types?

The above looks really good -- I prefer the design to the one that I proposed on Discourse.

> Besides `rand` and `logpdf`, what other methods does this new type need?
Yes, there are a few methods of `mean` and `cov` that should just fall back to `fobs`. Basically just look for methods in `src/abstract_gp.jl` that accept `FiniteGP`s and implement them :)
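For concreteness, a minimal sketch of what those fallbacks could look like; the wrapper type, its field names (`fobs`/`finducing`), and the exact method list are illustrative assumptions rather than the final implementation:

```julia
using AbstractGPs, Random, Statistics

# Illustrative wrapper: the dense FiniteGP at the observation inputs plus a
# FiniteGP at the inducing inputs (a fuller sketch including logpdf appears further down).
struct SparseFiniteGP{T1<:AbstractGPs.FiniteGP,T2<:AbstractGPs.FiniteGP}
    fobs::T1
    finducing::T2
end

# Fallback methods: anything that only involves the observation inputs simply
# delegates to the wrapped dense FiniteGP.
Statistics.mean(f::SparseFiniteGP) = mean(f.fobs)
Statistics.cov(f::SparseFiniteGP) = cov(f.fobs)
Statistics.var(f::SparseFiniteGP) = var(f.fobs)
Base.length(f::SparseFiniteGP) = length(f.fobs)
Random.rand(rng::Random.AbstractRNG, f::SparseFiniteGP) = rand(rng, f.fobs)
```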
> Where should this code go? In `src/abstract_gp.jl`?
We'll have quite a lot of approximate inference stuff once this has been added, so maybe it makes sense to create a new file called `src/approximate_inference.jl` or something, put this there, and move `elbo`-related stuff from `src/abstract_gp.jl` into that file?
> What tests are needed -- in particular, what comparisons should there be between the sparse and dense GPs?
I would just check that `mean`, `cov`, and `rand` (with identical seeds) produce the same thing in all cases, and that `elbo` with regular `FiniteGP`s and `logpdf` under `SparseFiniteGP`s produce the same answer.
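A rough sketch of such a comparison test; the `SparseFiniteGP(fobs, finducing)` constructor, the inducing-point choice, and the `elbo(fobs, y, finducing)` call are assumptions for illustration, not the actual test suite:

```julia
using AbstractGPs, Random, Statistics, Test

@testset "SparseFiniteGP agrees with the dense FiniteGP" begin
    rng = MersenneTwister(1)
    x = rand(rng, 100)                       # observation inputs
    z = x[1:10]                              # inducing inputs
    f = GP(SqExponentialKernel())
    fx, fz = f(x, 0.1), f(z, 0.1)
    fsparse = SparseFiniteGP(fx, fz)
    y = rand(rng, fx)

    # mean, cov, and rand (with identical seeds) should match the dense FiniteGP.
    @test mean(fsparse) ≈ mean(fx)
    @test cov(fsparse) ≈ cov(fx)
    @test rand(MersenneTwister(42), fsparse) ≈ rand(MersenneTwister(42), fx)

    # logpdf under the sparse GP should equal the elbo of the dense FiniteGP
    # with the same inducing points.
    @test logpdf(fsparse, y) ≈ elbo(fx, y, fz)
end
```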
Ok, that all sounds good. PR on the way.
A continuation/focusing of this discussion on Discourse. Defining the following structure and methods allows using a sparse GP in a Turing model:
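A minimal sketch of the kind of structure and methods being described; the field names `fobs`/`finducing`, the `AbstractMvNormal` supertype, and the `elbo`-based `logpdf` are illustrative assumptions rather than a verbatim copy of the proposal:

```julia
using AbstractGPs, Distributions, Random

# Sketch: wrap the dense FiniteGP at the observation inputs together with a
# FiniteGP at a (much smaller) set of inducing inputs.
struct SparseFiniteGP{T1<:AbstractGPs.FiniteGP,T2<:AbstractGPs.FiniteGP} <: Distributions.AbstractMvNormal
    fobs::T1        # GP at the observation inputs
    finducing::T2   # GP at the inducing inputs
end

Base.length(f::SparseFiniteGP) = length(f.fobs)

# Sampling falls back to the dense FiniteGP at the observation inputs.
Random.rand(rng::Random.AbstractRNG, f::SparseFiniteGP) = rand(rng, f.fobs)

# The log-density is replaced by the variational lower bound, which is what makes
# inference cheap for many observations. This assumes an elbo method of the form
# elbo(fobs, y, finducing); the exact signature may differ between package versions.
Distributions.logpdf(f::SparseFiniteGP, y::AbstractVector{<:Real}) = elbo(f.fobs, y, f.finducing)
```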
An example model is shown below. Using the `SparseFiniteGP` speeds up inference dramatically for a simulated dataset with ~1000 observations.
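As a sketch, such a Turing model might look roughly like the following; the kernel, priors, noise level, and inducing-point choice are all illustrative, and it assumes the `SparseFiniteGP` definition sketched above:

```julia
using AbstractGPs, Turing

# Illustrative Turing model: x are observation inputs, y the observations,
# and z a (much smaller) set of inducing-point locations.
@model function sparse_gp_model(x, y, z)
    # Kernel hyperparameters and observation noise (priors are illustrative).
    σf ~ LogNormal(0.0, 1.0)
    ℓ ~ LogNormal(0.0, 1.0)
    σn ~ LogNormal(0.0, 1.0)

    f = GP(σf^2 * with_lengthscale(SEKernel(), ℓ))
    fobs = f(x, σn^2 + 1e-6)        # dense finite GP at the observation inputs
    finducing = f(z, σn^2 + 1e-6)   # finite GP at the inducing inputs
    fsparse = SparseFiniteGP(fobs, finducing)

    # Because logpdf falls back to the elbo, this observation statement is what
    # keeps inference cheap for ~1000 observations.
    y ~ fsparse
end

# Hypothetical usage with simulated data and a coarse grid of inducing points:
# x = rand(1000); z = collect(range(0, 1; length=20))
# y = rand(GP(SEKernel())(x, 0.1))
# chain = sample(sparse_gp_model(x, y, z), NUTS(), 500)
```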
Some questions to discuss while I put together a PR:
- Any thoughts on the interface or structure of the types?
- Besides `rand` and `logpdf`, what other methods does this new type need?
- Where should this code go? In `src/abstract_gp.jl`?
- What tests are needed? In particular, what comparisons should there be between the sparse and dense GPs?