cornellius-gp / gpytorch

A highly efficient implementation of Gaussian Processes in PyTorch

[Feature Request]: General Matern Kernels #1252

Open dhruvbalwada opened 4 years ago

dhruvbalwada commented 4 years ago

🚀 Feature Request

Have a general Matern kernel as an option, in addition to the current version, which only accepts nu = 1/2, 3/2, and 5/2.

Motivation

I am trying to use GP modeling to estimate spectral properties of (and sometimes also map) 2D or 3D scalar fields from scattered or patchy observations. My understanding is that the Matern kernel (or combinations of Matern kernels) is a good way to approach this problem (https://arxiv.org/abs/1605.01684). For example, Matern kernels with nu = 1/2, 3/2, and 5/2 correspond to spectral slopes of -2, -4, and -6 at high wavenumber. In oceanography, however, it is common to find fields whose spectral slopes differ from these (e.g., -5/3 or -3 are more common) or are a combination (e.g., -3 at intermediate wavenumbers and -5/3 at high wavenumbers).
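If I understand correctly, this comes from the Matern spectral density, which in $d$ dimensions is (up to a constant)

$$S(k) \propto \left(\frac{2\nu}{\ell^2} + 4\pi^2 |k|^2\right)^{-(\nu + d/2)},$$

so the high-wavenumber decay goes like $|k|^{-(2\nu + d)}$; in one dimension that gives the slopes -2, -4, and -6 for nu = 1/2, 3/2, 5/2, while a target slope like -5/3 or -3 would require a value of nu outside {1/2, 3/2, 5/2}.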

I was wondering whether there is a reason a general kernel has not been implemented (maybe it is very hard to optimize over the value of nu), or whether it is simply missing because most communities doing GP modeling don't have a use for it?

Additional context

I am very new to GPs, so any thoughts or guidance would be appreciated.

jacobrgardner commented 4 years ago

In general, the nu = integer + 1/2 Matern kernels are a lot nicer to work with because they don't require evaluating fairly expensive special functions like the modified Bessel functions: those kernels reduce to a product of an exponential and a polynomial. The worry is that general-nu kernels might become outrageously expensive to evaluate, so people in ML tend to stick to the integer + 1/2 setting.

I'm not opposed to a general implementation in principle, but it would require implementing both the forward and backward passes manually, since I doubt PyTorch natively supports the special functions involved.
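To make the contrast concrete, here is a rough NumPy/SciPy sketch (not GPyTorch's actual implementation) of the nu = 5/2 closed form next to the general form that needs a modified Bessel function:

```python
import math
import numpy as np
from scipy.special import kv, gamma  # modified Bessel function of the second kind, gamma function

def matern_half_integer_52(r, lengthscale=1.0):
    # nu = 5/2: reduces to exp(-sqrt(5) r / l) times a quadratic polynomial -- cheap.
    d = math.sqrt(5.0) * r / lengthscale
    return (1.0 + d + d ** 2 / 3.0) * np.exp(-d)

def matern_general(r, nu, lengthscale=1.0):
    # General nu: needs gamma and K_nu -- much more expensive, and scipy's kv
    # is not differentiable through PyTorch autograd out of the box.
    d = math.sqrt(2.0 * nu) * r / lengthscale
    d = np.where(d == 0, 1e-12, d)  # K_nu diverges at 0; the kernel itself tends to 1
    return (2.0 ** (1.0 - nu) / gamma(nu)) * (d ** nu) * kv(nu, d)
```

For nu = 2.5 the two agree numerically (e.g. `matern_general(r, 2.5)` vs `matern_half_integer_52(r)`), but only the first form avoids the Bessel evaluations.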

dhruvbalwada commented 4 years ago

Thanks a lot for the quick response. I am not familiar enough with the machinery to do the implementation myself. I will proceed with the integer + 1/2 kernels for now and revisit this when other values of nu become absolutely necessary.

gpleiss commented 4 years ago

I think one challenge is that the general-nu form requires computing modified Bessel functions. There's no native PyTorch way to do this, but I guess you could wrap some scipy functions. See https://discuss.pytorch.org/t/modified-bessel-function-of-order-0/18609/6
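Something along these lines is what I have in mind (an untested sketch for a fixed order nu, using the identity dK_nu/dx = -(K_{nu-1}(x) + K_{nu+1}(x)) / 2; the class name is made up):

```python
import torch
from scipy.special import kv

class ModifiedBesselKv(torch.autograd.Function):
    """Wraps scipy.special.kv so it can sit inside a PyTorch graph.

    Gradients are taken w.r.t. the argument x only; nu is treated as a constant.
    """

    @staticmethod
    def forward(ctx, x, nu):
        ctx.save_for_backward(x)
        ctx.nu = nu
        out = torch.as_tensor(kv(nu, x.detach().cpu().numpy()), dtype=x.dtype)
        return out.to(x.device)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        nu = ctx.nu
        x_np = x.detach().cpu().numpy()
        # dK_nu/dx = -(K_{nu-1}(x) + K_{nu+1}(x)) / 2
        dkv = -0.5 * (kv(nu - 1, x_np) + kv(nu + 1, x_np))
        grad_x = grad_output * torch.as_tensor(dkv, dtype=x.dtype, device=x.device)
        return grad_x, None  # no gradient w.r.t. nu
```

Usage inside a kernel's forward would look like `ModifiedBesselKv.apply(scaled_dist, nu)`. Note this round-trips through NumPy on CPU, which is part of why it would be slow.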

dhruvbalwada commented 4 years ago

Thanks, I will look into this.

shc443 commented 8 months ago

Hello! Currently GPyTorch only accepts nu as a float taking the value 0.5, 1.5, or 2.5 for the Matern kernel. I am just wondering whether you are planning to add more values of nu, like 5.5, 15.5, etc.

It would of course be computationally more expensive, but I would love to start a PR for this if that would be welcome :D

gpleiss commented 8 months ago

I'd be okay adding 3.5 and 4.5 to GPyTorch, but I'm worried that higher values or general values might be complex. I could be convinced though :)

Alternatively, we could create a special GeneralMaternKernel for non-standard nu values.
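Roughly what I'm imagining for such a kernel (just a sketch under the assumption that we have a differentiable Bessel wrapper like the `ModifiedBesselKv` sketched earlier in this thread; neither class exists in GPyTorch today, and numerical niceties are omitted):

```python
import math
import torch
from gpytorch.kernels import Kernel

class GeneralMaternKernel(Kernel):
    """Matern kernel for arbitrary nu > 0 (sketch, not an official GPyTorch class).

    k(r) = 2^(1 - nu) / Gamma(nu) * (sqrt(2 nu) r / l)^nu * K_nu(sqrt(2 nu) r / l)
    """

    has_lengthscale = True

    def __init__(self, nu, **kwargs):
        super().__init__(**kwargs)
        self.nu = nu

    def forward(self, x1, x2, diag=False, **params):
        # Lengthscale-scaled distance; clamp away from 0 because K_nu diverges there
        # (the kernel itself tends to 1 as r -> 0).
        dist = self.covar_dist(x1.div(self.lengthscale), x2.div(self.lengthscale),
                               diag=diag, **params)
        scaled = (math.sqrt(2 * self.nu) * dist).clamp_min(1e-8)
        const = 2 ** (1 - self.nu) / math.gamma(self.nu)
        # ModifiedBesselKv is the hypothetical autograd wrapper around scipy.special.kv.
        return const * scaled.pow(self.nu) * ModifiedBesselKv.apply(scaled, self.nu)
```

This keeps the existing half-integer `MaternKernel` fast path untouched and isolates the expensive Bessel evaluation in the new class.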

shc443 commented 8 months ago

I would be very interested in creating a GeneralMaternKernel. May I submit a PR for this?

gpleiss commented 8 months ago

Sure. Follow this PR as an example, and be sure to closely read our contribution guidelines.