JuliaGaussianProcesses / KernelFunctions.jl

Julia package for kernel functions for machine learning
https://juliagaussianprocesses.github.io/KernelFunctions.jl/stable/
MIT License

Add `KernelTensorSum` #507

Open martincornejo opened 1 year ago

martincornejo commented 1 year ago

Addresses https://github.com/JuliaGaussianProcesses/KernelFunctions.jl/issues/506. Implements a new kernel composition, `KernelTensorSum`, and the related operator.

The naming should be discussed, since the term `KernelTensorSum` might not be appropriate. Suggestions are welcome.
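To illustrate the idea (this is a minimal, hypothetical sketch, not the code in this PR; the name `MyTensorSum` and the scalar-per-coordinate handling are assumptions for illustration): each wrapped kernel acts on one coordinate of the input and the results are summed, analogous to how `KernelTensorProduct` multiplies the per-coordinate evaluations.

```julia
using KernelFunctions

# Hypothetical sketch of a "tensor sum" composition: each kernel acts on one
# coordinate of the input and the results are added, mirroring how
# KernelTensorProduct multiplies per-coordinate kernel evaluations.
struct MyTensorSum{K<:Tuple} <: Kernel
    kernels::K
end

MyTensorSum(ks::Kernel...) = MyTensorSum(ks)

function (k::MyTensorSum)(x::AbstractVector, y::AbstractVector)
    length(x) == length(k.kernels) ||
        throw(DimensionMismatch("number of kernels must match the input dimension"))
    return sum(ki(xi, yi) for (ki, xi, yi) in zip(k.kernels, x, y))
end

# Example: additive structure over two input dimensions.
k = MyTensorSum(SqExponentialKernel(), Matern52Kernel())
k([1.0, 2.0], [1.5, 2.5])  # == SqExponentialKernel()(1.0, 1.5) + Matern52Kernel()(2.0, 2.5)
```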

codecov[bot] commented 1 year ago

Codecov Report

Patch coverage has no change; project coverage changes by -29.38% :warning:

Comparison: base (ef6d459) at 94.16% vs. head (598fbc7) at 64.78%.

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##           master     #507       +/-   ##
===========================================
- Coverage   94.16%   64.78%   -29.38%     
===========================================
  Files          52       53       +1      
  Lines        1387     1414      +27      
===========================================
- Hits         1306      916     -390      
- Misses         81      498     +417      
```

| [Impacted Files](https://app.codecov.io/gh/JuliaGaussianProcesses/KernelFunctions.jl/pull/507?src=pr&el=tree) | Coverage Δ | |
|---|---|---|
| [src/KernelFunctions.jl](https://app.codecov.io/gh/JuliaGaussianProcesses/KernelFunctions.jl/pull/507?src=pr&el=tree#diff-c3JjL0tlcm5lbEZ1bmN0aW9ucy5qbA==) | `100.00% <ø> (ø)` | |
| [src/kernels/kerneltensorsum.jl](https://app.codecov.io/gh/JuliaGaussianProcesses/KernelFunctions.jl/pull/507?src=pr&el=tree#diff-c3JjL2tlcm5lbHMva2VybmVsdGVuc29yc3VtLmps) | `0.00% <0.00%> (ø)` | |
| [src/kernels/overloads.jl](https://app.codecov.io/gh/JuliaGaussianProcesses/KernelFunctions.jl/pull/507?src=pr&el=tree#diff-c3JjL2tlcm5lbHMvb3ZlcmxvYWRzLmps) | `16.66% <ø> (-83.34%)` | :arrow_down: |

... and [28 files with indirect coverage changes](https://app.codecov.io/gh/JuliaGaussianProcesses/KernelFunctions.jl/pull/507/indirect-changes?src=pr&el=tree-more)

:umbrella: View full report in Codecov by Sentry.

martincornejo commented 1 year ago

The test failures introduced with Julia 1.9 are unrelated to the new kernel. How should I proceed?

martincornejo commented 1 year ago

I've renamed KernelTensorSum to KernelIndependentSum since, in my opinion, it better conveys what it is. I could also revert this commit; do you have any opinions on the naming?

martincornejo commented 1 year ago

Hi, is there anything holding up this PR? Let me know if there is anything more I should do on my side. Thanks!

devmotion commented 1 year ago

I would have chosen KernelTensorSum, for consistency with KernelTensorProduct, similar to how we use the names KernelSum/KernelProduct. But I don't have a strong opinion.
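For context, the existing compositions that the name would sit alongside can be sketched as follows (current exported API, as far as I'm aware):

```julia
using KernelFunctions

k1 = SqExponentialKernel()
k2 = Matern52Kernel()

# Sum and product act on the *same* input:
KernelSum(k1, k2)            # (x, y) -> k1(x, y) + k2(x, y), also written k1 + k2
KernelProduct(k1, k2)        # (x, y) -> k1(x, y) * k2(x, y), also written k1 * k2

# The tensor product gives each kernel its own input dimension:
KernelTensorProduct(k1, k2)  # (x, y) -> k1(x[1], y[1]) * k2(x[2], y[2]), also k1 ⊗ k2
```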

willtebbutt commented 1 year ago

Sorry for not getting around to reviewing this.

> I've renamed KernelTensorSum to KernelIndependentSum since, in my opinion, it better conveys what it is. I could also revert this commit; do you have any opinions on the naming?

Could you elaborate on why this is a better name? This actually does worry me, because I value consistency with KernelTensorProduct.

martincornejo commented 1 year ago

> I would have chosen KernelTensorSum, for consistency with KernelTensorProduct, similar to how we use the names KernelSum/KernelProduct. But I don't have a strong opinion.

> Could you elaborate on why this is a better name? This actually does worry me, because I value consistency with KernelTensorProduct.

That was my initial idea as well, but without the context of KernelTensorProduct, it is difficult to understand what KernelTensorSum is in itself. Also, is the term "tensor sum" used properly in this context? If not, we would be trading off consistency against "formal correctness".
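
To make the comparison concrete (my paraphrase of the two constructions, for inputs split into blocks):

$$
(k_1 \oplus k_2)\big((x_1, x_2), (y_1, y_2)\big) = k_1(x_1, y_1) + k_2(x_2, y_2),
\qquad
(k_1 \otimes k_2)\big((x_1, x_2), (y_1, y_2)\big) = k_1(x_1, y_1)\,k_2(x_2, y_2).
$$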

But I'm completely open on this; both are valid alternatives. I'll leave the decision up to you.

willtebbutt commented 1 year ago

That's fair. I agree that I'm not entirely sure the use of the term is correct, but my preference is still consistency, plus an issue suggesting that both be renamed when we next make a breaking change / a proposal to deprecate the tensor names.

theogf commented 1 year ago

I only found one reference to the term "Tensor sum" here: https://www.degruyter.com/document/doi/10.1515/spma-2019-0009/html?lang=en

devmotion commented 1 year ago

The term "tensor sum" is also used in this proposal: https://github.com/JuliaLang/julia/issues/13333 (even though arguably "direct sum" is more common - but not always synonymous: https://github.com/JuliaLang/julia/issues/13333#issuecomment-143825995)