JuliaGaussianProcesses / KernelFunctions.jl

Julia package for kernel functions for machine learning
https://juliagaussianprocesses.github.io/KernelFunctions.jl/stable/
MIT License

`WendlandKernel`: Compact support -> sparse kernel matrix #550

Open timweiland opened 6 months ago

timweiland commented 6 months ago

Hey 👋 I think Wendland kernels are cool, and I would like to contribute a PR. Let me know what you think :)

### What's this about?

Wendland kernels have compact support. If two points are farther apart than a cutoff determined by the kernel's lengthscale, their covariance under a Wendland kernel is exactly zero. This yields sparse kernel matrices, which can be exploited to save both memory and compute. In particular, Julia ships with highly optimised sparse Cholesky factorizations (via CHOLMOD/SuiteSparse).
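To make the sparsity point concrete, here is a minimal sketch (not the proposed implementation): it uses the simplest Wendland function $\psi_{1,0}(r) = (1-r)_+$, assembles the kernel matrix directly in sparse form, and hands it to Julia's sparse Cholesky. The function names and the jitter value are illustrative assumptions.

```julia
using LinearAlgebra, SparseArrays

# Simplest Wendland function ψ_{1,0}(r) = (1 - r)_+, positive definite on R^1.
# Any pair of points farther apart than the lengthscale gets exactly zero
# covariance, so the kernel matrix can be assembled directly in sparse form.
wendland10(r) = max(1 - r, 0.0)

function sparse_kernel_matrix(x::AbstractVector, lengthscale::Real)
    Is, Js, Vs = Int[], Int[], Float64[]
    for i in eachindex(x), j in eachindex(x)
        v = wendland10(abs(x[i] - x[j]) / lengthscale)
        if v > 0          # store only entries inside the support
            push!(Is, i); push!(Js, j); push!(Vs, v)
        end
    end
    return sparse(Is, Js, Vs, length(x), length(x))
end

x = collect(range(0, 10; length = 200))
K = sparse_kernel_matrix(x, 0.5)

# With lengthscale 0.5 on a grid of spacing ≈ 0.05, each point interacts with
# only ~9 neighbours per side, so K is banded and mostly zero.
fill_fraction = nnz(K) / length(K)

# Julia's sparse cholesky (CHOLMOD) exploits this structure; a small jitter
# guards against numerical semi-definiteness.
F = cholesky(Symmetric(K + 1e-9 * I))
y = F \ ones(length(x))
```

On this grid the fill fraction is well below 10%, and the factorization cost scales with the bandwidth rather than with $n^2$.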

Within their support, the Wendland functions are polynomials whose (rational) coefficients can be computed in closed form.
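As a sketch of that closed form (following Chapter 9 of Wendland's book, under my reading of it): $\psi_{d,k} = \mathcal{I}^k (1-r)_+^\ell$ with $\ell = \lfloor d/2 \rfloor + k + 1$, where $(\mathcal{I}p)(r) = \int_r^1 t\,p(t)\,dt$ maps polynomials on $[0,1]$ to polynomials. With exact rational arithmetic the coefficients come out exactly; function names here are illustrative.

```julia
# p is a coefficient vector: p[j+1] is the coefficient of r^j on [0, 1].
function integral_operator(p::Vector{Rational{Int}})
    q = zeros(Rational{Int}, length(p) + 2)
    for j in 0:length(p)-1
        # ∫_r^1 t^(j+1) dt = (1 - r^(j+2)) / (j + 2)
        q[1]   += p[j+1] // (j + 2)
        q[j+3] -= p[j+1] // (j + 2)
    end
    return q
end

function wendland_coeffs(d::Int, k::Int)
    ℓ = d ÷ 2 + k + 1
    # coefficients of (1 - r)^ℓ via the binomial expansion
    p = [Rational{Int}((-1)^j * binomial(ℓ, j)) for j in 0:ℓ]
    for _ in 1:k
        p = integral_operator(p)
    end
    return p   # unnormalised: e.g. ψ_{3,1}(0) = 1/20 here
end
```

For example, `wendland_coeffs(3, 1)` recovers (up to normalisation) the familiar $\psi_{3,1}(r) \propto (1-r)_+^4\,(4r+1)$.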

At the same time, Wendland kernels have nice theoretical properties. In particular, much like for the Matérn family, their smoothness is controlled directly by a smoothness parameter.

For more details, and in particular for the definition of Wendland functions, refer to Chapter 9 of Scattered Data Approximation by Holger Wendland.

### What do I propose?

I would like to add the WendlandKernel to KernelFunctions.jl. I already have an implementation locally for Wendland kernels with arbitrary space dimension $d$ and smoothness parameter $k$. It produces sparse kernel matrices.
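For discussion, here is a hypothetical sketch of how a fixed-order Wendland kernel could slot into the `SimpleKernel` interface; the name `Wendland31Kernel` and the hard-coded $\psi_{3,1}$ are illustrative assumptions, not the actual PR (which handles arbitrary $d$ and $k$).

```julia
using KernelFunctions
import KernelFunctions: kappa, metric
using Distances: Euclidean

struct Wendland31Kernel <: KernelFunctions.SimpleKernel end

# ψ_{3,1}(r) = (1 - r)_+^4 (4r + 1): compactly supported on [0, 1],
# positive definite for inputs of dimension d ≤ 3.
kappa(::Wendland31Kernel, r::Real) = r < 1 ? (1 - r)^4 * (4r + 1) : zero(r)
metric(::Wendland31Kernel) = Euclidean()

# Lengthscale via the usual transform machinery:
k = Wendland31Kernel() ∘ ScaleTransform(1 / 0.5)
```

With this, `kernelmatrix(k, x)` would produce a matrix that is mostly exact zeros whenever the data are spread over more than a few lengthscales, which is what the sparse code paths would then exploit.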

### Difficulties

Currently, AbstractGPs.jl does not work with sparse Cholesky factorizations. I have added some extensions locally, but I am not sure whether these belong in KernelFunctions.jl or in a separate PR to AbstractGPs.jl.