Closed by @bmcfee 2 years ago
@bmcfee, I agree, the last check (`if shape[axis] < 1: ...`) can be removed.
Great, thanks for the quick response!
Merging #99 (f559573) into main (5641595) will increase coverage by 0.56%. The diff coverage is 81.81%.
```diff
@@            Coverage Diff             @@
##             main      #99      +/-  ##
==========================================
+ Coverage   81.25%   81.81%   +0.56%
==========================================
  Files           4        4
  Lines          96       99       +3
==========================================
+ Hits           78       81       +3
  Misses         18       18
```
| Flag | Coverage Δ | |
|---|---|---|
| unittests | 81.81% <81.81%> (+0.56%) | :arrow_up: |

Flags with carried forward coverage won't be shown.
| Impacted Files | Coverage Δ | |
|---|---|---|
| resampy/core.py | 82.00% <50.00%> (+1.23%) | :arrow_up: |
| resampy/filters.py | 78.57% <88.88%> (+0.19%) | :arrow_up: |
Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

Powered by Codecov. Last update 5641595...f559573.
This PR fixes #97 and adds functionality to reset the cache. (This should never be necessary, but it can be helpful for debugging.)
In implementing this, I had to rewrite an in-place multiply of the filter coefficients https://github.com/bmcfee/resampy/blob/56415951d25a539a6fb6f0065dfcf1379f6a3c8a/resampy/core.py#L117-L120 with a copy-multiply. This will cost us a little when downsampling, but is free for upsampling. I don't think the cost is any more than what we were already paying for redundant loading from disk each time, so it should be a net win. In my informal benchmarks, I'm seeing improvements on the order of 10ms.
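The rewrite can be sketched as follows. This is a minimal illustration of the in-place vs. copy-multiply distinction, not resampy's actual code; the variable names (`cached_win`, `sample_ratio`) are hypothetical:

```python
import numpy as np

# Cached filter coefficients, shared across calls. Scaling them in place
# (cached_win *= sample_ratio) would corrupt the cache for later callers.
cached_win = np.array([1.0, 0.5, 0.25])

sample_ratio = 0.5  # downsampling: ratio < 1 requires rescaling the filter

# Copy-multiply: allocates a new array when downsampling, leaving the
# cached coefficients untouched; upsampling (ratio >= 1) pays nothing.
interp_win = cached_win * sample_ratio if sample_ratio < 1 else cached_win
```

The extra allocation only happens on the downsampling path, which is why the cost is bounded by what the redundant disk loads were already costing.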
I've put in a few more tests to bring up the coverage while I'm at it. In doing so, I found what I think to be a dead code path introduced in #82 , and I'd like @avalentino to confirm:
https://github.com/bmcfee/resampy/blob/56415951d25a539a6fb6f0065dfcf1379f6a3c8a/resampy/core.py#L206-L219
Failing on this last check would require `len(t_out) < 1` (i.e., `len(t_out) == 0`), but that case would fail earlier, when computing the min and max in the second check: NumPy's min or max of an empty array raises a ValueError. On the surface we can safely remove this check, but I'm wondering whether it's a bug and was intended to check something other than the domain check?
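The earlier failure is easy to demonstrate (a quick standalone illustration, not code from resampy):

```python
import numpy as np

t_out = np.array([])  # empty output time grid: len(t_out) == 0

# The domain check computes t_out.min() and t_out.max() first; NumPy
# raises on an empty array, so a later len(t_out) < 1 branch can
# never be reached.
try:
    t_out.min()
    reached_length_check = True
except ValueError as err:
    reached_length_check = False
    print(err)  # reduction over a zero-size array has no identity
```

So the length check is dead code as written, unless the intent was to guard something the ValueError doesn't cover.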