SheffieldML / GPy

Gaussian processes framework in python
BSD 3-Clause "New" or "Revised" License

How do I manually update hyperparameters for self-made kernels? #620

Closed yenicelik closed 6 years ago

yenicelik commented 6 years ago

I have tried posting this on Stack Overflow, but it looks like GPy is not well represented there: https://stackoverflow.com/questions/49597043/how-to-set-hyperparameters-of-kernels-in-gpy

I am currently using GPy to build a custom kernel that translates the input before applying further operations to it. Sometimes I need to set the hyperparameters of the kernel it encloses. In this case, self.inner_kernel is of type GPy.kern.src.stationary.Matern32.

My question is: how can I update the lengthscale and variance parameters of kernels in GPy? My current approach is the following, but it does not work; the inner_kernel still retains the old hyperparameters:

    def set_s(self, s):
        assert(isinstance(s, float))
        self.s = s
        print("Updating s!")
        self.inner_kernel.lengthscale = Param("variance", np.asarray(s), Logexp())
        self.parameters_changed()
        self.inner_kernel.parameters_changed()

Any ideas or help would be greatly appreciated.

mzwiessele commented 6 years ago

Just set the lengthscale or any parameter with a float or numpy array, e.g.:

k.lengthscale = 4.
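
For instance, a minimal sketch with a standalone Matern32 (the variable names here are only illustrative):

    import numpy as np
    from GPy.kern import Matern32

    k = Matern32(input_dim=2, ARD=True)
    k.lengthscale = np.array([0.5, 2.0])   # a numpy array: one value per input dimension with ARD
    k.variance = 4.                        # a plain float also works
    print(k)                               # the kernel now reports the updated values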


yenicelik commented 6 years ago

So I have this function to update the parameters:

def set_s(self, s, safe=False):
    assert safe
    assert isinstance(s, float) or isinstance(s, Param), type(s)
    self.inner_kernel.variance = s
    self.s = Param('outerKernel.variance', self.inner_kernel.variance)
    self.link_parameters(self.s, self.inner_kernel)
    self.inner_kernel.parameters_changed()

but when I call

    gp_reg = GPRegression(self.X, self.Y.reshape(-1, 1), self.kernel, noise_var=sn)
    gp_reg.optimize("lbfgs")

then the parameters stay the same. Do I have to link them in a special manner?

To be more explicit about the structure I have, I have a class:

class TripathyMaternKernel(Kern):
    def __init__(self):
        self.inner_kernel = Matern32(
            input_dim=self.active_dim,
            variance=self.sample_variance() if variance is None else variance,
            lengthscale=self.sample_lengthscale() if lengthscale is None else lengthscale,
            ARD=True)

and I want to optimize over both the Gaussian noise parameter (Gaussian.NoiseVar) AND the parameters of the inner kernel (inner_kernel.lengthscale and inner_kernel.variance).
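
For reference, here is a minimal sketch of that pattern (not the actual TripathyMaternKernel; the input translation and the sampling helpers are omitted, and the class and variable names are illustrative): a wrapper kernel that calls link_parameters on its inner Matern32 in __init__, so that GPRegression's optimizer updates the inner lengthscale and variance together with the Gaussian noise variance.

    import numpy as np
    from GPy.kern import Kern, Matern32
    from GPy.models import GPRegression

    class WrapperKernel(Kern):
        """Illustrative wrapper kernel that delegates to an inner Matern32."""
        def __init__(self, input_dim, name='wrapper'):
            super(WrapperKernel, self).__init__(input_dim, None, name)
            self.inner_kernel = Matern32(input_dim=input_dim, ARD=True)
            # Linking puts the inner kernel's lengthscale and variance into this
            # kernel's parameter hierarchy, so the model's optimizer can see them.
            self.link_parameters(self.inner_kernel)

        def K(self, X, X2=None):
            return self.inner_kernel.K(X, X2)

        def Kdiag(self, X):
            return self.inner_kernel.Kdiag(X)

        def update_gradients_full(self, dL_dK, X, X2=None):
            # Gradients for the linked inner parameters are computed by the inner kernel.
            self.inner_kernel.update_gradients_full(dL_dK, X, X2)

    X = np.random.randn(20, 2)
    Y = np.random.randn(20, 1)
    m = GPRegression(X, Y, WrapperKernel(2), noise_var=0.1)
    m.optimize('lbfgs')   # optimizes the noise variance and the linked inner kernel parameters
    print(m)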