JuliaGaussianProcesses / KernelFunctions.jl

Julia package for kernel functions for machine learning
https://juliagaussianprocesses.github.io/KernelFunctions.jl/stable/
MIT License

Fix `@test_broken` problems #546

Closed · simsurace closed this 6 months ago

simsurace commented 6 months ago

Fixes part of #526

devmotion commented 6 months ago

There are more bugs regarding `@test_broken`, such as https://github.com/JuliaGaussianProcesses/KernelFunctions.jl/blob/f71cfe0bc5d2c54e927641f897823cf6baf91ac8/test/transform/selecttransform.jl#L107: `@test_broken` should only be applied to expressions that evaluate to a `Bool` (which is `false` in the broken case but `true` once the problem is fixed).
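
For reference, the intended pattern looks like this generic sketch (not code from the repository):

```julia
using Test

# `@test_broken` expects a Bool-valued expression: while the bug persists it
# evaluates to `false` and is recorded as Broken; once the bug is fixed, the
# test reports an "Unexpected Pass", reminding us to switch back to `@test`.
@test_broken 1 + 1 == 3
```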

codecov[bot] commented 6 months ago

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Comparison is base (e5696d7) 67.00% compared to head (6412453) 0.44%. Report is 1 commit behind head on master.

:exclamation: Current head 6412453 differs from pull request most recent head 50de544. Consider uploading reports for the commit 50de544 to get more accurate results

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##           master     #546       +/-   ##
==========================================
- Coverage   67.00%    0.44%   -66.56%
==========================================
  Files          52       52
  Lines        1373     1342       -31
==========================================
- Hits          920        6      -914
- Misses        453     1336      +883
```

:umbrella: View full report in Codecov by Sentry.

simsurace commented 6 months ago

> There are more bugs regarding `@test_broken`, such as https://github.com/JuliaGaussianProcesses/KernelFunctions.jl/blob/f71cfe0bc5d2c54e927641f897823cf6baf91ac8/test/transform/selecttransform.jl#L107: `@test_broken` should only be applied to expressions that evaluate to a `Bool` (which is `false` in the broken case but `true` once the problem is fixed).

I see. Why don't they show up as errors, though? What is the test supposed to do anyway? It is just evaluating a gradient and assigning it to `ga`. What is broken about this?

devmotion commented 6 months ago

Probably it currently throws an error, which is handled by `@test_broken` as well. So that case is fine for now, but it's still wrong, since the tests will error as soon as the gradient computation stops throwing. See https://discourse.julialang.org/t/psa-when-tests-fail-due-to-improvements-in-type-inference/102001
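
To make the failure mode concrete, here is a minimal sketch (with a placeholder `failing_gradient` standing in for the real gradient call on the linked line):

```julia
using Test

# Placeholder for the gradient computation that currently throws.
failing_gradient() = error("AD backend cannot handle this input yet")

# Today: the call throws, so `@test_broken` records the result as Broken.
@test_broken failing_gradient()

# Once the computation stops throwing, the expression returns a gradient
# rather than a Bool, so (as noted above) the test errors instead of being
# reported as an "Unexpected Pass":
# @test_broken failing_gradient_fixed()   # hypothetical fixed version
```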

simsurace commented 6 months ago

Would it be reasonable to replace these with an approximate equality check, comparing the gradient from the chosen AD backend to the one from `:FiniteDiff`?

devmotion commented 6 months ago

I guess these are the intended tests: https://github.com/JuliaGaussianProcesses/KernelFunctions.jl/blob/f71cfe0bc5d2c54e927641f897823cf6baf91ac8/test/transform/selecttransform.jl#L76-L82. So for ReverseDiff we could just skip the intermediate variables and directly `@test_broken` the approximate equality of the two gradients?
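
A rough sketch of what that could look like (the backend choices and `f` are placeholders, not the repository's actual test utilities):

```julia
using Test
using Zygote, FiniteDifferences  # stand-ins for the AD backends used in the test suite

f(x) = sum(abs2, x)  # placeholder for the kernel test function
x = randn(3)

# Compare the AD gradient directly against a finite-difference reference,
# with no intermediate assignments inside the test macro.
@test first(Zygote.gradient(f, x)) ≈ first(FiniteDifferences.grad(central_fdm(5, 1), f, x))

# For a backend whose gradient is still broken, the same comparison would be
# wrapped in `@test_broken` instead (assuming `using ReverseDiff`), e.g.
# @test_broken ReverseDiff.gradient(f, x) ≈ first(FiniteDifferences.grad(central_fdm(5, 1), f, x))
```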

simsurace commented 6 months ago

This should do it.