cornellius-gp / gpytorch

A highly efficient implementation of Gaussian Processes in PyTorch

Constraint documentation #2242

Closed · JackBuck closed this issue 1 year ago

JackBuck commented 1 year ago

šŸ“š Documentation/Examples

Would you be able to provide a little more documentation on how constraints work in gpytorch? I understand from the documentation that constraints are implemented by training raw parameters which, when transformed, correspond to parameters that satisfy the constraints. However, I notice there is an undocumented[^1] feature controlling whether a constraint is enforced. Specifically, if I set transform=None when creating the constraint, then the enforced property returns False and calling transform() or inverse_transform() on the constraint returns the input unchanged. As far as I can tell, the effect is that the raw parameter and the transformed parameter are identical.
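
For example, here is a minimal sketch of the behaviour I'm describing (the numeric values in the comments are approximate):

import torch
from gpytorch.constraints import GreaterThan

# Enforced constraint (the default): the raw parameter is unconstrained and the
# transform maps it to a value above the lower bound.
enforced = GreaterThan(1e-4)
print(enforced.enforced)                         # True
print(enforced.transform(torch.tensor(-5.0)))    # softplus(-5.0) + 1e-4, i.e. ~0.0068

# Unenforced constraint: transform() and inverse_transform() are identity maps,
# so the raw parameter and the "constrained" parameter coincide.
unenforced = GreaterThan(1e-4, transform=None)
print(unenforced.enforced)                       # False
print(unenforced.transform(torch.tensor(-5.0)))  # tensor(-5.), returned unchanged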

My questions are:

- What is the purpose of setting transform=None, i.e. of creating a constraint that is not enforced?
- When a constraint is not enforced, what (if anything) ensures the parameter still satisfies it during training?

For additional context, I came across this feature in botorch, which sets transform=None on the noise constraint if no likelihood is passed when creating a SingleTaskGP:

likelihood = GaussianLikelihood(
    noise_prior=noise_prior,
    batch_shape=self._aug_batch_shape,
    noise_constraint=GreaterThan(
        MIN_INFERRED_NOISE_LEVEL,
        transform=None,
        initial_value=noise_prior_mode,
    ),
)

https://github.com/pytorch/botorch/blob/c2e502c5eb9cf266875edf92024521235c69f64e/botorch/models/gp_regression.py#L132

[^1]: At least, I didn't find the documentation when I searched!

gpleiss commented 1 year ago

We are currently working on it :) See #2252

gpleiss commented 1 year ago

I'm going to close this in favor of #640 (duplicate issue)

Balandat commented 1 year ago

I guess that PR doesn't really address @JackBuck's question about what happens when setting transform=None (unless I missed something in #2252).

Basically, the idea of assigning a constraint with transform=None is that, instead of using Adam or some other stochastic first-order optimizer on an unconstrained problem (with the constraints enforced via the transform), you may want to use a different optimizer (such as L-BFGS-B) that imposes the constraints directly on the parameters, by explicitly expressing the constraint as part of the optimization problem. In botorch this is done e.g. in the fit_gpytorch_mll_scipy function here: https://github.com/pytorch/botorch/blob/main/botorch/optim/fit.py#L109
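
To make the contrast concrete, here is a rough sketch (this is not the actual botorch code; the toy model, the data, and the choice to optimize only the raw noise parameter are assumptions made purely for illustration) of handing the bound directly to scipy's L-BFGS-B:

import numpy as np
import torch
from scipy.optimize import minimize

import gpytorch
from gpytorch.constraints import GreaterThan

# Toy training data, purely for illustration.
train_x = torch.linspace(0, 1, 20)
train_y = torch.sin(6 * train_x) + 0.1 * torch.randn(20)


class ToyGP(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.RBFKernel()

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )


# transform=None: raw_noise *is* the noise, so the lower bound has to be handed
# to the optimizer as an explicit box constraint rather than enforced by a transform.
likelihood = gpytorch.likelihoods.GaussianLikelihood(
    noise_constraint=GreaterThan(1e-4, transform=None, initial_value=1.0)
)
model = ToyGP(train_x, train_y, likelihood)
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
model.train()
likelihood.train()

# Only the raw noise parameter is optimized here, to keep the sketch short.
raw_noise = likelihood.noise_covar.raw_noise


def neg_mll(x):
    # Set the raw noise from the scipy vector, then return the negative marginal
    # log likelihood and its gradient for L-BFGS-B.
    with torch.no_grad():
        raw_noise.fill_(float(x[0]))
    model.zero_grad()
    loss = -mll(model(train_x), train_y)
    loss.backward()
    return loss.item(), np.array([raw_noise.grad.sum().item()])


# The GreaterThan(1e-4) constraint shows up here, as a bound on the parameter itself.
result = minimize(neg_mll, x0=[1.0], jac=True, method="L-BFGS-B", bounds=[(1e-4, None)])

With the default (enforced) constraint you would instead optimize the unconstrained raw parameter, e.g. with Adam, and rely on the transform to keep the noise above the bound.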

JackBuck commented 1 year ago

Thanks very much @Balandat - that answers my question perfectly. :+1: