Closed by YannCabanes 9 months ago
All modified lines are covered by tests :white_check_mark:
Comparison is base (e7a177c) 92.70% compared to head (74ef6c4) 93.32%.
:exclamation: Current head 74ef6c4 differs from the pull request's most recent head 7e228ee. Consider uploading reports for commit 7e228ee to get more accurate results.
In the file tslearn/metrics/soft_dtw_fast.py, the functions `_soft_dtw`, `_soft_dtw_batch`, `_soft_dtw_grad` and `_soft_dtw_grad_batch` do not return a value: they modify their input arrays in place, as mutable objects. An error was introduced in PR #479 by casting the input values to the selected backend with `be.array`. Indeed, `be.array` casts the input values to the desired backend, but it creates a copy of them, so the in-place modifications are applied to the copy and never reach the caller. These lines are removed in this PR. We add a test to make sure that this error will not be reproduced.
We also add the option `compute_with_backend` to the functions `soft_dtw`, `soft_dtw_alignment`, `cdist_soft_dtw` and `cdist_soft_dtw_normalized`. Previously, the input data was always cast to NumPy arrays so that the Numba `@njit`-decorated code could run, and the results were converted back to the backend of the input data. With this option, we can now use PyTorch automatic differentiation with these functions.