It seems like there is a lot of stuff going on in the papers that you mentioned, using and combining a number of different approaches. I am not sure the term "structured GP" is universally accepted - seems like this is used kind of like a catch-all for GP-ish models that incorporate domain knowledge. The place I usually see this term used is in exploiting structure in kernel / covariance matrices in order to speed up inference.
As to the more general interpretation, there are lots of flavors of this. We have done some of this on our end, including things like deep kernel learning (DKL) and latent-space BO (@Ryan-Rhys has done lots of this), semiparametric GP models (@bletham), and transfer-learning-type modeling in which a model fit across a bunch of related data is included as an informative prior.
At a high level, the more domain knowledge you have and the more specific you can get, the better a model you can construct for the specific use case at hand. From an implementation perspective, most of the basic concepts for this exist in BoTorch, but it's rather hard to build a generic interface for these kinds of models at the level of Ax. As always, it's a tradeoff between customizability and usability.
@Balandat thanks for the great info, and thanks for taking a look at some of the resources I linked.
> It seems like there is a lot of stuff going on in the papers that you mentioned, using and combining a number of different approaches. I am not sure the term "structured GP" is universally accepted - seems like this is used kind of like a catch-all for GP-ish models that incorporate domain knowledge. ...
Good to know about the terminology.
> ... The place I usually see this used is in exploiting structure in kernel / covariance matrices in order to speed up inference.
Interesting - I might need to take a look.
> As to the more general interpretation, there are lots of flavors of this. We have done some of this on our end, including things like DKL and latent-space BO (@Ryan-Rhys has done lots of this), semiparametric GP models (@bletham), ...
@Ryan-Rhys I'm familiar with the BoTorch VAE+BO example. Are there any resources on DKL or other latent-space BO approaches you've worked with that you might be able to share? @bletham, I'm also interested in the semiparametric GP models; I had trouble finding an example in Ax or BoTorch.
> ... and transfer-learning-type modeling in which a model fit across a bunch of related data is included as an informative prior.
Multi-task and multi-fidelity modeling, contextual variables, and custom featurizers definitely come to mind in this respect (https://github.com/facebook/Ax/issues/1038), as do nonlinear constraints (https://github.com/facebook/Ax/issues/153) and the related discussion on incorporating domain knowledge (https://github.com/facebook/Ax/issues/828). A rough sketch of the multi-task flavor is below.
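Just to make sure I'm picturing the transfer-learning flavor correctly, here is a minimal sketch of how I'd assume it could look in BoTorch with `MultiTaskGP` and a task-indicator column. The toy data and column layout are placeholders of mine, not anything from your setup:

```python
import torch
from botorch.models import MultiTaskGP
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import ExactMarginalLogLikelihood

# toy data: 2 design variables plus a task-indicator column
# (0 = historical/related task, 1 = target task)
train_X = torch.tensor(
    [[0.1, 0.2, 0.0], [0.4, 0.8, 0.0], [0.9, 0.1, 0.0],  # historical task
     [0.2, 0.3, 1.0], [0.7, 0.6, 1.0]],                   # target task
    dtype=torch.double,
)
train_Y = torch.tensor([[0.3], [0.7], [0.5], [0.4], [0.9]], dtype=torch.double)

# the learned inter-task covariance lets data from the related task
# inform predictions for the target task
model = MultiTaskGP(train_X, train_Y, task_feature=-1, output_tasks=[1])
mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_mll(mll)
```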
> At a high level, the more domain knowledge you have and the more specific you can get, the better a model you can construct for the specific use case at hand. From an implementation perspective, most of the basic concepts for this exist in BoTorch, but it's rather hard to build a generic interface for these kinds of models at the level of Ax. As always, it's a tradeoff between customizability and usability.
Agreed!
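For anyone landing here later, my rough mental model of "doing this at the BoTorch level" is something like the sketch below: swap a custom GPyTorch kernel/mean encoding whatever structure you have into `SingleTaskGP`. The kernel choice here is only a placeholder assumption, not a recommendation:

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from gpytorch.kernels import ScaleKernel, MaternKernel
from gpytorch.mlls import ExactMarginalLogLikelihood

train_X = torch.rand(20, 3, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True) + 0.05 * torch.randn(20, 1, dtype=torch.double)

# domain knowledge would go into the choice/construction of this kernel
# (placeholder here: ARD Matern-5/2)
covar_module = ScaleKernel(MaternKernel(nu=2.5, ard_num_dims=3))

model = SingleTaskGP(train_X, train_Y, covar_module=covar_module)
mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_mll(mll)
```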
closing as discussion.
I probably lack the understanding and the language required to talk about this effectively, so here are a few follow-up questions.
From my basic understanding, DKL is functionally similar to performing BO over a VAE latent space, except that the latent-space embeddings aren't held fixed; the manifold itself is learned based on what the deep kernel learning model decides is "useful" or not. At a higher level, I've been told it's useful for incorporating physical insight/domain knowledge (e.g. physical models) into active learning.
I'm asking based on some discussion with Sergei Kalinin about the DKL models his group has been applying in microscopy settings and how the approach might apply to other domains. See e.g. https://arxiv.org/abs/2205.15458
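To check that I'm describing it correctly, here's a bare-bones deep kernel learning sketch in GPyTorch, the way I currently understand it. The tiny network architecture and RBF kernel are placeholder assumptions of mine, not anything taken from the paper above:

```python
import torch
import gpytorch

class DKLRegression(gpytorch.models.ExactGP):
    """GP whose kernel operates on features produced by a small neural network."""

    def __init__(self, train_x, train_y, likelihood, input_dim, latent_dim=2):
        super().__init__(train_x, train_y, likelihood)
        # the "deep" part: a learned map from inputs to a low-dimensional manifold
        self.feature_extractor = torch.nn.Sequential(
            torch.nn.Linear(input_dim, 32),
            torch.nn.ReLU(),
            torch.nn.Linear(32, latent_dim),
        )
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernel(ard_num_dims=latent_dim)
        )

    def forward(self, x):
        # embeddings are learned jointly with the GP, not fixed like a pre-trained VAE
        z = self.feature_extractor(x)
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(z), self.covar_module(z)
        )

# joint training: NN weights and GP hyperparameters optimized under one marginal likelihood
train_x = torch.rand(50, 10)
train_y = torch.sin(train_x.sum(dim=-1))
likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = DKLRegression(train_x, train_y, likelihood, input_dim=10)

model.train()
likelihood.train()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
for _ in range(100):
    optimizer.zero_grad()
    loss = -mll(model(train_x), train_y)
    loss.backward()
    optimizer.step()
```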
Related (from a Twitter search of "deep kernel learning"):
From https://github.com/ziatdinovmax/gpax:
Feel free to close as this is just a discussion post, and no worries if this doesn't fit well within the scope of Ax/BoTorch. Curious to hear your thoughts, if any!