JuliaGaussianProcesses / KernelFunctions.jl

Julia package for kernel functions for machine learning
https://juliagaussianprocesses.github.io/KernelFunctions.jl/stable/
MIT License

Designing some test fakes #464

Open willtebbutt opened 2 years ago

willtebbutt commented 2 years ago

We have decent enough interface tests in this package, but we don't have good test fakes. See this blog post for a description of what they are and why they're useful. The reason I think they would be useful here: if you maintain a package that consumes kernels, you currently have to pick a kernel and some inputs to it yourself. Often, as in the tests involving a GP in AbstractGPs, it would be really helpful if KernelFunctions just gave you a kernel, plus some collections of valid inputs for that kernel, for which you could be confident that

  1. the kernel matrix produced is (numerically) positive definite, and
  2. the kernel doesn't satisfy any properties beyond what the interface specifies. That is, you probably want to test with a kernel that is non-stationary, doesn't have unit variance, etc., to avoid accidentally depending on an edge case.

Maybe we should publish in the TestUtils module some kind of weird composite kernel that we know isn't stationary etc., and that does play nicely with AD, along with some collections of inputs to that kernel that downstream users can make use of.
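To make the idea concrete, here is a rough sketch of what such a fake might look like. The names `fake_kernel` and `fake_inputs` are hypothetical, not part of the current KernelFunctions.jl API; the particular composite chosen is just one way to get something non-stationary with non-unit variance:

```julia
using KernelFunctions, LinearAlgebra, Random

# Hypothetical test fake: a deliberately awkward composite kernel.
# The LinearKernel term makes it non-stationary, the 2.5 scaling gives it
# non-unit variance, and the input ScaleTransform means it isn't a plain
# "textbook" kernel either — but it is still positive definite.
fake_kernel() = 2.5 * (SEKernel() + LinearKernel()) ∘ ScaleTransform(0.7)

# Hypothetical collection of valid inputs: 10 points in R^3, wrapped in
# ColVecs so each column of the matrix is treated as one input.
fake_inputs(rng) = ColVecs(randn(rng, 3, 10))

rng = MersenneTwister(1234)
k = fake_kernel()
x = fake_inputs(rng)
K = kernelmatrix(k, x)

# A downstream package can rely on numerical positive definiteness
# (a tiny jitter guards against round-off on the smallest eigenvalues).
@assert isposdef(Symmetric(K) + 1e-12 * I)
```

A downstream test could then just call these two functions instead of hand-picking a kernel and inputs, which is exactly the decoupling test fakes are meant to provide.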

Any thoughts as to whether this would be useful?

edit: link to blog post added