ppgaluzio / MOBOpt

Multi-objective Bayesian optimization
MIT License
81 stars 23 forks

GP Kernel #4

Open RodrigoAVargasHdz opened 4 years ago

RodrigoAVargasHdz commented 4 years ago

Hi,

I like your code :)

I was wondering if you could implement kernels with individual parameters for each feature. This tends to make the GPs more accurate.

Thanks :)

ppgaluzio commented 4 years ago

Hey, thanks!

I guess I could do it by including an extra parameter in the init method of the class; you could then give a kernel object to the optimizer with the parameters of your choice. That way the interface wouldn't change, and the user would have more flexibility in the implementation.

RodrigoAVargasHdz commented 4 years ago

Hi!

Yeah, you could do something like this:

    self.GP[i] = GPR(length_scale=np.ones(self.x_dim),
                     kernel=C(1.0, (1e-3, 1e3)) * Matern(nu=2.5),
                     n_restarts_optimizer=self.n_rest_opt)

where x_dim is the number of features and C is the constant kernel. You may have to import them first:

     from sklearn.gaussian_process.kernels import Matern, ConstantKernel as C 

Thanks!

RodrigoAVargasHdz commented 4 years ago

I made a mistake; the corrected version is:

    self.GP[i] = GPR(kernel=C(1.0, (1e-3, 1e3)) * Matern(length_scale=np.ones(self.x_dim), nu=2.5), 
                     n_restarts_optimizer=self.n_rest_opt)

where "length_scale=np.ones(self.x_dim)" indicates that the kernel function has an independent length-scale parameter for each individual feature.
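A minimal, self-contained sketch of the corrected idea (not from the MOBOpt codebase; `x_dim` and the toy data are illustrative): an anisotropic Matern kernel with one length scale per feature, fitted with scikit-learn's GP regressor.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor as GPR
from sklearn.gaussian_process.kernels import Matern, ConstantKernel as C

x_dim = 3  # hypothetical number of features
rng = np.random.default_rng(0)
X = rng.uniform(size=(20, x_dim))
y = np.sin(3 * X[:, 0]) + 0.1 * X[:, 1]  # toy objective

# length_scale=np.ones(x_dim) gives the Matern kernel one length scale
# per feature (an anisotropic / ARD kernel)
kernel = C(1.0, (1e-3, 1e3)) * Matern(length_scale=np.ones(x_dim), nu=2.5)
gp = GPR(kernel=kernel, n_restarts_optimizer=2)
gp.fit(X, y)

# after fitting, the optimized kernel carries one length scale per feature
print(gp.kernel_.k2.length_scale.shape)  # (3,)
```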

ppgaluzio commented 4 years ago

I was thinking more of something like:

def __init__(self, ..., kernel=None):

    if kernel is None:
        self._kernel = Matern(nu=nu) # current default
    else:
        self._kernel = kernel

    self.GP[i] = GPR(kernel=self._kernel)

so that the user can pass the kernel they want or use the default. In that case, I think the kernel parameter could even be a list of kernel objects, so that each objective could have a different kernel.
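A hedged sketch of that interface (class and parameter names are illustrative, not MOBOpt's actual code): an init method that accepts no kernel, a single kernel shared across objectives, or one kernel per objective.

```python
from sklearn.gaussian_process import GaussianProcessRegressor as GPR
from sklearn.gaussian_process.kernels import Matern

class Optimizer:
    """Toy stand-in for the optimizer class discussed above."""

    def __init__(self, n_obj, kernel=None, nu=1.5):
        if kernel is None:
            # fall back to the current default kernel
            kernels = [Matern(nu=nu) for _ in range(n_obj)]
        elif isinstance(kernel, (list, tuple)):
            # one kernel object per objective
            kernels = list(kernel)
        else:
            # same kernel for every objective (GPR clones it on fit)
            kernels = [kernel] * n_obj
        self.GP = [GPR(kernel=k) for k in kernels]
```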

RodrigoAVargasHdz commented 4 years ago

Yeah, that could work too! You would just have to import the different kernel types from the sklearn library.

RodrigoAVargasHdz commented 4 years ago

Hi, I saw that you updated the code and included the possibility of using more robust kernels, thanks :) I should define the kernel function using the standard notation from sklearn, correct?

Again, thanks :)

ppgaluzio commented 4 years ago

Hi, yeah, I just haven't had time to test it yet, but it should work without a problem. You just define any kernel from sklearn; the argument is passed directly to the GP, so any kernel implemented in sklearn is gonna work.
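A small sketch of the point above, assuming the optimizer simply forwards the kernel argument to scikit-learn's GP: any sklearn kernel object then works without changing the calling code (the data here is a toy example).

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor as GPR
from sklearn.gaussian_process.kernels import RBF, Matern, RationalQuadratic

rng = np.random.default_rng(1)
X = rng.uniform(size=(15, 2))
y = X[:, 0] ** 2 - X[:, 1]

# each of these standard sklearn kernels trains through the exact
# same GPR call, since the kernel is passed straight through
for kernel in (RBF(), Matern(nu=2.5), RationalQuadratic()):
    gp = GPR(kernel=kernel).fit(X, y)
    assert gp.predict(X).shape == (15,)
```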