**Closed** — willow-ahrens closed this 6 months ago
@nullplay what do you think of this? The approach here is to merge the runs at the end. I'd like to merge this PR and then reorganize the test suite a bit, but in the future we could also add a level flag to skip deduplication.
Attention: Patch coverage is 89.53488%, with 36 lines in your changes missing coverage. Please review.
Project coverage is 76.40%. Comparing base (d1a59ee) to head (c65cbbc).
| Files | Patch % | Lines |
|---|---|---|
| src/tensors/levels/denserlelevels.jl | 87.45% | 32 Missing :warning: |
| src/tensors/levels/sparserlelevels.jl | 95.50% | 4 Missing :warning: |
This PR uses a lazy approach to merging identical runs. The steps are roughly:
It's not a particularly beautiful approach, but it allows us to automatically merge runs so that we can construct SparseRLE more seamlessly.
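As a conceptual illustration of the run-merging idea, here is a minimal sketch in Python (rather than the package's Julia). The `ends`/`values` names are invented for the example and do not reflect Finch's actual level internals; the sketch only shows the core deduplication pass of collapsing adjacent runs with equal values.

```python
def merge_runs(ends, values):
    """Merge adjacent runs that carry equal values.

    `ends` holds the inclusive end position of each run and `values` the
    run's value, as in a run-length-encoded level. Returns new (ends,
    values) lists with identical neighboring runs collapsed.
    """
    merged_ends, merged_values = [], []
    for end, val in zip(ends, values):
        if merged_values and merged_values[-1] == val:
            # Same value as the previous run: extend it instead of
            # starting a new one.
            merged_ends[-1] = end
        else:
            merged_ends.append(end)
            merged_values.append(val)
    return merged_ends, merged_values
```

For example, `merge_runs([2, 4, 7, 9], [1.0, 1.0, 0.0, 0.0])` collapses the four runs into two: `([4, 9], [1.0, 0.0])`.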
Also, while the overhead of merging should be fairly minimal for `SparseRLE(Element(0.0))`, we may want to come up with a way to pass arguments to `freeze` so that we can skip deduplication. I'm open to suggestions; it could be a level parameter or a `freeze` parameter.

Next step (in this PR or otherwise):
Do the same thing for RepeatRLE level, essentially copy-pasting whatever we decide here.
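To make the skip-deduplication suggestion concrete, here is a hypothetical sketch (again in Python, with invented names — Finch's `freeze` does not currently take such an argument) of what a `dedup` flag on the freeze step could look like:

```python
def freeze(ends, values, dedup=True):
    """Hypothetical freeze step for an RLE-style level.

    The `dedup` flag stands in for the proposed freeze (or level)
    parameter: when False, the run-merging pass is skipped entirely,
    which is safe if the caller knows adjacent runs are already distinct.
    """
    if not dedup:
        return ends, values
    merged_ends, merged_values = [], []
    for end, val in zip(ends, values):
        if merged_values and merged_values[-1] == val:
            merged_ends[-1] = end  # extend the previous identical run
        else:
            merged_ends.append(end)
            merged_values.append(val)
    return merged_ends, merged_values
```

With `dedup=False` the input is returned untouched, avoiding the merging overhead; with the default `dedup=True`, adjacent identical runs are collapsed.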