dflemin3 / approxposterior

A Python package for approximate Bayesian inference and optimization using Gaussian processes
https://dflemin3.github.io/approxposterior/
MIT License

Scaling parameter values to improve GP hyperparameter optimization #38

Closed dflemin3 closed 5 years ago

dflemin3 commented 5 years ago

In the original BAPE algorithm paper, Kandasamy+2015 scaled model parameter values to [0,1] using the appropriate simple linear transformation. Performing this scaling in approxposterior could improve convergence and numerical stability by keeping parameter values in a reasonable range, which is especially helpful when optimizing the GP kernel's metric (length-scale) hyperparameters.

This can be implemented without much difficulty using the sklearn preprocessing module, e.g. the MinMaxScaler. Furthermore, since the sklearn codebase is well-tested and robust, its inclusion shouldn't introduce many dependency issues.
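A minimal sketch of the idea, assuming a hypothetical `theta` array of sampled model parameters (this is not approxposterior code, just the sklearn pattern being proposed):

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Hypothetical model parameter samples, shape (n_samples, n_params).
# Note the two columns live on very different scales.
theta = np.array([[1.0, 100.0],
                  [2.0, 300.0],
                  [3.0, 500.0]])

# Map each parameter (column) onto [0, 1] with a linear transformation
scaler = MinMaxScaler(feature_range=(0, 1))
theta_scaled = scaler.fit_transform(theta)

# inverse_transform recovers the original values, e.g. for forward
# model evaluations that expect unscaled parameters
theta_recovered = scaler.inverse_transform(theta_scaled)
```

The GP would then be trained on `theta_scaled`, while the forward model is always called with the inverse-transformed values.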

To fit the scaler, I could either use the bounds kwarg that stipulates the hard bounds for the model parameters, or train the scaler on the GP's initial theta; I think the former is preferable.
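The bounds-based option could look like the following sketch (the `bounds` list here is a hypothetical stand-in for the kwarg, not the package's actual API): fitting the scaler on the bounds themselves, rather than on sampled theta, guarantees that any in-bounds point maps into [0, 1], whereas a scaler fit on the initial samples can send later points outside that range.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Hypothetical hard bounds for two model parameters, (min, max) per parameter
bounds = [(-5.0, 5.0), (0.0, 1000.0)]

# Stack the bounds into a (2, n_params) array: row 0 holds the minima,
# row 1 the maxima. Fitting on this array makes the scaler's data range
# equal to the allowed parameter range.
bounds_arr = np.array(bounds).T
scaler = MinMaxScaler(feature_range=(0, 1)).fit(bounds_arr)

# Any point inside the bounds now lands in [0, 1]
theta = np.array([[0.0, 500.0]])
theta_scaled = scaler.transform(theta)  # [[0.5, 0.5]]
```

This also keeps the transformation fixed for the whole run, so GP hyperparameters learned early on remain on the same scale as later iterations.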

dflemin3 commented 5 years ago

Added the ability to scale parameters between (0,1) and am working on more robust scaling (#44) on the dev branch.