acerbilab / pyvbmc

PyVBMC: Variational Bayesian Monte Carlo algorithm for posterior and model inference in Python
https://acerbilab.github.io/pyvbmc/
BSD 3-Clause "New" or "Revised" License

Bounds for gp hyperparameters #99

Closed: pipme closed this issue 1 year ago

pipme commented 2 years ago

The default recommended bounds for the GP from gpyreg may not be appropriate when the observed values span a large range, e.g., in extreme cases the training set contains one point with density value 1e3 and another with 1e-18. In these cases, it's better to set the bounds from the high posterior density region, as below:

https://github.com/lacerbi/pyvbmc/blob/670fa0ce85214436923a31ee91d685e176824119/pyvbmc/vbmc/gaussian_process_train.py#L1177-L1181 https://github.com/lacerbi/pyvbmc/blob/670fa0ce85214436923a31ee91d685e176824119/pyvbmc/vbmc/gaussian_process_train.py#L1245-L1246

After that, np.inf will be replaced by gpyreg's recommended bounds (link); overall this gives broader bounds on the GP's output scale and length scale.
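For illustration, the replacement step described above could be sketched as follows with plain numpy. The helper name `fill_default_bounds` and the example bound values are hypothetical, not the actual pyvbmc/gpyreg code:

```python
import numpy as np

def fill_default_bounds(user_bounds, recommended_bounds):
    """Replace np.inf placeholder entries with recommended defaults.

    Hypothetical sketch of the merging step: any entry left as +/-inf
    by the caller falls back to the corresponding recommended bound.
    """
    user_bounds = np.asarray(user_bounds, dtype=float)
    recommended_bounds = np.asarray(recommended_bounds, dtype=float)
    return np.where(np.isinf(user_bounds), recommended_bounds, user_bounds)

# Example: the caller fixed the first hyperparameter's bounds but left
# the second as +/-inf placeholders (rows are lower/upper bounds).
user = np.array([[-2.0, np.inf], [2.0, np.inf]])
recommended = np.array([[-5.0, -10.0], [5.0, 10.0]])
print(fill_default_bounds(user, recommended))
# [[ -2. -10.]
#  [  2.  10.]]
```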

Also, it might be good for pyvbmc to use the PLB and PUB computed from the high posterior density region here to generate random samples, though I haven't tried that.
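A minimal sketch of deriving plausible lower/upper bounds (PLB, PUB) from the high posterior density region of the training set; the function name, the `frac` parameter, and the top-fraction selection rule are assumptions for illustration, not pyvbmc's actual implementation:

```python
import numpy as np

def hpd_bounds(X, log_densities, frac=0.8):
    """Derive plausible bounds from the high posterior density region.

    Keeps the top `frac` fraction of training points ranked by log
    density and takes the coordinate-wise min/max as PLB and PUB.
    """
    X = np.asarray(X, dtype=float)
    log_densities = np.asarray(log_densities, dtype=float)
    n_keep = max(1, int(np.ceil(frac * len(X))))
    order = np.argsort(log_densities)[::-1]  # highest density first
    hpd_points = X[order[:n_keep]]
    return hpd_points.min(axis=0), hpd_points.max(axis=0)

# Example: the low-density outlier at (10, 10) is excluded, so the
# plausible bounds shrink to the high-density region.
X = np.array([[0.0, 0.0], [1.0, 1.0], [10.0, 10.0]])
log_d = np.array([0.0, -1.0, -100.0])
plb, pub = hpd_bounds(X, log_d, frac=0.5)
print(plb, pub)  # [0. 0.] [1. 1.]
```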

By the way, it's currently a bit confusing how the prior and bounds specified in pyvbmc interact with the gp.set_bounds and gp.get_recommended_bounds methods in gpyreg. It took me a while to understand. It might be better to add documentation or restructure the code a bit.

lacerbi commented 2 years ago

Thanks @pipme. We'll have to discuss.

I think there is something wrong in how gpyreg handles the default bounds. The placeholder for a default bound should be np.nan, not np.inf (otherwise, when reading the code above, I'd think that you are setting the bound to infinity rather than to the default).
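A small sketch of the convention suggested here, where np.nan marks "use the default" while np.inf remains a genuine unbounded value. The helper `fill_defaults_nan` is hypothetical, not gpyreg's API:

```python
import numpy as np

def fill_defaults_nan(user_bounds, recommended_bounds):
    """nan means 'unspecified, use the default'; inf is a real bound.

    Only nan entries fall back to the recommended defaults, so a
    deliberately infinite bound is preserved as-is.
    """
    user_bounds = np.asarray(user_bounds, dtype=float)
    recommended_bounds = np.asarray(recommended_bounds, dtype=float)
    return np.where(np.isnan(user_bounds), recommended_bounds, user_bounds)

# First entry: use the default; second entry: genuinely unbounded.
user = np.array([np.nan, np.inf])
recommended = np.array([-5.0, 5.0])
print(fill_defaults_nan(user, recommended))  # [-5. inf]
```

With np.nan as the sentinel, reading the calling code is unambiguous: np.inf always means an infinite bound, never a default.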

pipme commented 1 year ago

Fixed in #116.