JuliaGaussianProcesses / KernelFunctions.jl

Julia package for kernel functions for machine learning
https://juliagaussianprocesses.github.io/KernelFunctions.jl/stable/
MIT License

Add lengthscale parameter to SquareExp Kernel #484

Closed · schmidtjonathan closed this 1 year ago

schmidtjonathan commented 1 year ago

Summary

This PR adds a lengthscale parameter to the SqExponentialKernel, making it more flexible.

[1] C. E. Rasmussen and C. K. I. Williams, Gaussian processes for machine learning. Cambridge, Mass: MIT Press, 2006.

Proposed changes

According to Eq. (4.9) in [1], the squared-exponential kernel (a.k.a. Gaussian kernel, RBF kernel) has a characteristic lengthscale $\ell$ as a parameter, such that

$$k(r) = \exp\left(-\frac{r^2}{2\ell^2}\right),$$

where $r = d(x, y)$ is the distance between the two points $x$ and $y$. This PR adds this parameter with `lengthscale = 1.0` as the default, which recovers the previous behaviour and therefore keeps existing code that uses `SqExponentialKernel` working unchanged.
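For illustration only, here is a minimal sketch of what such a parameterised kernel could look like, assuming KernelFunctions' `SimpleKernel`/`kappa`/`metric` extension points; the struct and its name are hypothetical and this is not the actual diff in this PR:

```julia
using KernelFunctions
using Distances: SqEuclidean

# Hypothetical squared-exponential kernel that stores its own lengthscale ℓ,
# following Eq. (4.9) of Rasmussen & Williams (2006): k(r) = exp(-r² / (2ℓ²)).
struct LengthscaleSqExponentialKernel{T<:Real} <: KernelFunctions.SimpleKernel
    ℓ::T
end

# Default ℓ = 1.0 recovers the current SqExponentialKernel behaviour.
LengthscaleSqExponentialKernel() = LengthscaleSqExponentialKernel(1.0)

# The kernel value as a function of the squared Euclidean distance d2.
KernelFunctions.kappa(k::LengthscaleSqExponentialKernel, d2) = exp(-d2 / (2 * k.ℓ^2))
KernelFunctions.metric(::LengthscaleSqExponentialKernel) = SqEuclidean()
```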

Breaking changes

None.

codecov[bot] commented 1 year ago

Codecov Report

Base: 94.32% // Head: 78.25% // Decreases project coverage by 16.07% :warning:

Coverage data is based on head (e866231) compared to base (eac3538). Patch coverage: 100.00% of modified lines in pull request are covered.

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##           master     #484       +/-   ##
===========================================
- Coverage   94.32%   78.25%   -16.08%
===========================================
  Files          52       52
  Lines        1358     1352        -6
===========================================
- Hits         1281     1058      -223
- Misses         77      294      +217
```

| [Impacted Files](https://codecov.io/gh/JuliaGaussianProcesses/KernelFunctions.jl/pull/484) | Coverage Δ | |
|---|---|---|
| src/basekernels/exponential.jl | `100.00% <100.00%> (ø)` | |
| src/mokernels/lmm.jl | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
| src/mokernels/slfm.jl | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
| src/kernels/gibbskernel.jl | `0.00% <0.00%> (-88.89%)` | :arrow_down: |
| src/kernels/normalizedkernel.jl | `0.00% <0.00%> (-82.50%)` | :arrow_down: |
| src/kernels/neuralkernelnetwork.jl | `0.00% <0.00%> (-77.56%)` | :arrow_down: |
| src/kernels/overloads.jl | `25.00% <0.00%> (-75.00%)` | :arrow_down: |
| src/mokernels/mokernel.jl | `50.00% <0.00%> (-50.00%)` | :arrow_down: |
| src/kernels/scaledkernel.jl | `41.17% <0.00%> (-47.06%)` | :arrow_down: |
| src/kernels/kernelproduct.jl | `53.84% <0.00%> (-46.16%)` | :arrow_down: |
| ... and 8 more | | |


willtebbutt commented 1 year ago

Hi @schmidtjonathan. Thanks for opening this PR. Have you had a chance to look at the docs on how lengthscales are handled in general in KernelFunctions? (I suspect you may have missed them 😄)
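For readers landing here: the documented approach is, roughly, to keep the base kernels parameter-free and express lengthscales via input transforms. A small sketch (the lengthscale value is just an example):

```julia
using KernelFunctions

ℓ = 2.5  # example lengthscale

# Compose the unit-lengthscale kernel with a ScaleTransform that rescales inputs by 1/ℓ ...
k1 = SqExponentialKernel() ∘ ScaleTransform(1 / ℓ)

# ... or use the with_lengthscale convenience wrapper, which builds the same composition.
k2 = with_lengthscale(SqExponentialKernel(), ℓ)

x, y = rand(3), rand(3)
k1(x, y) ≈ k2(x, y) ≈ exp(-sum(abs2, x - y) / (2ℓ^2))  # true
```

This keeps every base kernel free of lengthscale fields and lets the same mechanism cover ARD (via `ARDTransform`) and other input warpings.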

schmidtjonathan commented 1 year ago

Hi @willtebbutt, thanks for your quick reply! Yes, I absolutely missed that, sorry. Thanks for the hint.

willtebbutt commented 1 year ago

Np! Enjoy using KernelFunctions!