dfm / george

Fast and flexible Gaussian Process regression in Python
http://george.readthedocs.io
MIT License

Callable objects as the mean function - requiring parameters?? #122

Open tmcclintock opened 5 years ago

tmcclintock commented 5 years ago

According to the documentation, the mean argument to the GP object can be a scalar, a callable, or an object that follows the modeling protocol. However, this is slightly misleading: an arbitrary callable cannot actually be used as the mean function, as demonstrated by the following script (Python 3.7, george 0.3.1).

import george
import numpy as np
import scipy.optimize as op

x = np.linspace(0, 3*np.pi)
y = np.sin(x)

def mean_function(x):
    return np.zeros_like(x)

kernel = george.kernels.ExpSquaredKernel(1.)

####################################
# This GP builds with a scalar mean
####################################
gp1 = george.GP(kernel, mean=0)
gp1.compute(x)

def neg_ln_likelihood(p):
    gp1.set_parameter_vector(p)
    return -gp1.log_likelihood(y)
def grad_neg_ln_likelihood(p):
    gp1.set_parameter_vector(p)
    return -gp1.grad_log_likelihood(y)

result = op.minimize(neg_ln_likelihood, gp1.get_parameter_vector(),
                     jac=grad_neg_ln_likelihood)

####################################
# This GP works with a custom model
####################################
class MeanModel(george.modeling.Model):
    def get_value(self, x):
        return mean_function(x)

gp2 = george.GP(kernel, mean=MeanModel())
gp2.compute(x)

def neg_ln_likelihood(p):
    gp2.set_parameter_vector(p)
    return -gp2.log_likelihood(y)
def grad_neg_ln_likelihood(p):
    gp2.set_parameter_vector(p)
    return -gp2.grad_log_likelihood(y)

result = op.minimize(neg_ln_likelihood, gp2.get_parameter_vector(),
                     jac=grad_neg_ln_likelihood)

####################################
# This GP fails with a callable mean
####################################
gp3 = george.GP(kernel, mean=mean_function)
gp3.compute(x)

def neg_ln_likelihood(p):
    gp3.set_parameter_vector(p)
    return -gp3.log_likelihood(y)
def grad_neg_ln_likelihood(p):
    gp3.set_parameter_vector(p)
    return -gp3.grad_log_likelihood(y)

result = op.minimize(neg_ln_likelihood, gp3.get_parameter_vector(),
                     jac=grad_neg_ln_likelihood)

Note that this just reproduces the example from the docs.

In this script I create three GPs: one with a scalar mean, one with a custom modeling.Model object as the mean, and one with a plain Python function as the mean. The problem arises when GP.set_parameter_vector() is called on the GP whose mean is the plain function: the function only takes x as an argument and exposes no parameter vector, which the modeling protocol assumes is there.
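
As a quick illustration of why this fails (a minimal sketch, assuming the GP delegates parameter handling to whatever object is attached as the mean), the plain function simply lacks the modeling-protocol methods that a Model subclass provides:

hasattr(mean_function, "get_parameter_vector")  # False: a plain function exposes no parameters
hasattr(MeanModel(), "get_parameter_vector")    # True: Model subclasses follow the protocol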

I think either:
1) the docs are wrong and shouldn't say that an arbitrary callable can work, or
2) the GP object should not check for parameter names when the mean is a callable that doesn't subclass Model.

Option 2 seems reasonable to me because in that case the parameters really only exist in the covariance, not in the mean. A possible shim for option 2 is sketched below.
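
For reference, here is a minimal sketch of the kind of shim I have in mind (the CallableMean name and the idea of wrapping automatically are mine, not part of george): a thin Model subclass that delegates evaluation to an arbitrary callable and declares no parameters, so set_parameter_vector only ever touches the kernel.

class CallableMean(george.modeling.Model):
    # Hypothetical wrapper: forwards evaluation to an arbitrary callable
    # and exposes an empty parameter set.
    parameter_names = ()

    def __init__(self, func):
        super().__init__()
        self.func = func

    def get_value(self, x):
        return self.func(x)

gp4 = george.GP(kernel, mean=CallableMean(mean_function))
gp4.compute(x)
gp4.set_parameter_vector(gp4.get_parameter_vector())  # only kernel parameters, no error

If the GP (or the docs) did something like this for a bare callable, passing a plain function as the mean would behave the same as the scalar case.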

Thoughts?