cornellius-gp / gpytorch

A highly efficient implementation of Gaussian Processes in PyTorch

Custom Preconditioning Strategies #1997

Open sgalee2 opened 2 years ago

sgalee2 commented 2 years ago

I am looking at alternative preconditioning strategies for GP regression. How easy would it be to fit custom preconditioners into the GPyTorch framework, so that I can train GPs exactly as I would normally within GPyTorch, only swapping the preconditioner from pivoted Cholesky to a custom-made one?

Would it be better to build my own models on top of GPyTorch and fit the preconditioners in there, or can I slot my own code into the existing LazyTensor classes? I'm unsure how to tackle this and would greatly appreciate any opinions.

I should also add that while most of the preconditioners would follow the standard form

P = A Aᵀ + D,

this isn't a guarantee for all strategies. Would this pose a problem should I attempt to fit the new strategies into the current code?
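For preconditioners of that standard low-rank-plus-diagonal form, applying P⁻¹ never requires forming P explicitly: the Woodbury identity reduces the solve to diagonal solves plus a small low-rank correction. As a minimal pure-Python sketch (rank-1 case, i.e. A is a single column a; this is illustrative math, not GPyTorch code):

```python
def woodbury_solve_rank1(a, d, b):
    """Solve (a aT + D) x = b, where D = diag(d), via the Woodbury identity:

        P^{-1} b = D^{-1} b - D^{-1} a (aT D^{-1} b) / (1 + aT D^{-1} a)

    Pure-Python sketch of the rank-1 case; a real preconditioner would do
    the same with a tall-skinny matrix A and batched tensor operations.
    """
    dinv_b = [bi / di for bi, di in zip(b, d)]            # D^{-1} b
    dinv_a = [ai / di for ai, di in zip(a, d)]            # D^{-1} a
    num = sum(ai * x for ai, x in zip(a, dinv_b))         # aT D^{-1} b
    den = 1.0 + sum(ai * x for ai, x in zip(a, dinv_a))   # 1 + aT D^{-1} a
    coeff = num / den
    return [x - coeff * y for x, y in zip(dinv_b, dinv_a)]
```

For example, with a = (1, 2), d = (2, 3), the matrix is P = [[3, 2], [2, 7]], and the closure returns P⁻¹ b in O(n) rather than O(n³).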

Best, Adam

wjmaddox commented 2 years ago

You can currently modify the preconditioner for systems that you're looking at via the preconditioner_override flag in AddedDiagLazyTensor (see #930 for some discussion). See https://docs.gpytorch.ai/en/stable/lazy.html#addeddiaglazytensor for some documentation of that capability.
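Per the discussion in #930, the override is a callable that produces what `_preconditioner` normally returns: roughly, a closure applying P⁻¹, a representation of P itself, and its log-determinant (the exact contract may differ by version, so check `AddedDiagLazyTensor._preconditioner` in the source). A hypothetical Jacobi (diagonal) preconditioner, sketched in plain Python with no GPyTorch imports so only the shape of the idea is shown:

```python
import math

def make_jacobi_preconditioner(diag):
    """Sketch of what a preconditioner_override might compute: a Jacobi
    (diagonal) preconditioner P = diag(K + sigma^2 I).

    Plain-Python assumption-laden sketch: the real override would receive
    the AddedDiagLazyTensor and return tensors / lazy tensors, and its
    exact return contract should be verified against _preconditioner.
    Returns the triple discussed in #930:
        (closure applying P^{-1}, P itself, log det P)
    """
    def precond_closure(rhs):
        # Apply P^{-1}: elementwise division by the diagonal.
        return [r / d for r, d in zip(rhs, diag)]

    logdet = sum(math.log(d) for d in diag)
    return precond_closure, diag, logdet
```

For instance, `make_jacobi_preconditioner([2.0, 4.0])` yields a closure that maps `[2.0, 4.0]` to `[1.0, 1.0]` and a log-determinant of log 8. The point of the triple is that conjugate gradients needs only the P⁻¹ closure, while the stochastic log-determinant estimate needs log det P as a correction term.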