Closed — rettigl closed this 2 weeks ago
| Totals | |
|---|---|
| Change from base Build 11435224433: | 0.03% |
| Covered Lines: | 6500 |
| Relevant Lines: | 7067 |
Just a general remark I was thinking about, might be good to separate the unit tests we have with integration tests, and that way we also know more clearly where our coverage is lacking.
With the separation of tests into tests for the submodules and tests for the processor, this was already more or less my intention. What specifically do you have in mind to change?
Yes, that's definitely true. But currently the integration tests (the tests for the processor) report that we have complete coverage. While this is true at a higher level, the individual components might still not be unit tested. So I was thinking that maybe the testing workflows for the two should be split up.
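One way to make such a split concrete is to tag the processor-level tests and collect coverage for each group separately. A minimal sketch using pytest markers (the `integration` marker name and the report file names are hypothetical, assuming pytest and pytest-cov are in use, not something already set up in the repository):

```
# pytest.ini: register a custom marker so pytest does not warn about it
[pytest]
markers =
    integration: end-to-end tests that exercise the full processor

# CI: run the two groups with separate coverage reports
pytest -m "not integration" --cov --cov-report=xml:coverage-unit.xml
pytest -m "integration" --cov --cov-report=xml:coverage-integration.xml
```

That way the unit-test report would show directly which submodules lack their own coverage, independent of what the processor tests happen to touch.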
Further, using actual data (such as a subset of WSe2) could also allow us to do a more comprehensive real-world (end-to-end) test. Building the documentation sort of does that already, and if we place some assertions in e.g. a notebook, that would cover it.
Fixes a problem with applying transformations from a saved configuration without loading or generating a transformation slice, and adds tests for this case.