Open purva-thakre opened 1 month ago
Happy to go through this exercise, with the caveat that one could make the argument in "The Case Against 100% Code Coverage".
The caveat is understandable. I don't go by the coverage percentage: that value fluctuates inconsistently, and it's still not clear to me how it is calculated.
Typically, I look at the uncovered lines in the pytest coverage report and either make pytest ignore those lines or add unit tests for them. Perhaps we should discuss which lines deserve unit tests and which ones can safely be ignored.
For example, I don't think we need unit tests for mitiq/_about.py, but `# pragma: no cover` is currently ignored by the pytest coverage report. So, we will have to create a PR to make pytest skip these lines.
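If the pragma is being ignored, one common cause (an assumption about mitiq's setup, worth checking) is a custom `exclude_lines` setting: coverage.py replaces its default exclusion list, which contains the pragma pattern, with whatever `exclude_lines` provides. A sketch of a `pyproject.toml` fragment that restores it:

```toml
[tool.coverage.report]
# exclude_lines overrides coverage.py's defaults, so the pragma
# pattern must be re-listed explicitly to stay in effect.
exclude_lines = [
    "pragma: no cover",
    "raise NotImplementedError",
]
```

Newer coverage.py versions also offer `exclude_also`, which appends to the defaults instead of replacing them.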
> That value fluctuates inconsistently
It should no longer fluctuate after https://github.com/unitaryfund/mitiq/pull/2319, which fixed https://github.com/unitaryfund/mitiq/issues/2318.
Closely related to this issue, the following lines are ignored in the pytest coverage report. A unit test could be added that raises the error, similar to some of the changes in #2366.
Unless I am missing something and we can't raise this error?
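For reference, an otherwise-uncovered error branch can usually be exercised directly with `pytest.raises`. A minimal sketch, where `check_num_qubits` is a hypothetical stand-in rather than actual mitiq code:

```python
import pytest


def check_num_qubits(num_qubits: int) -> int:
    """Hypothetical helper with a defensive error branch."""
    if num_qubits <= 0:
        raise ValueError("num_qubits must be positive.")
    return num_qubits


def test_check_num_qubits_raises() -> None:
    # Covering the error branch just means triggering it on purpose;
    # match= checks the error message against a regex.
    with pytest.raises(ValueError, match="must be positive"):
        check_num_qubits(0)
```

If an error genuinely cannot be triggered through the public API, that is usually the signal to exclude the line rather than test it.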
Noticed that some lines in the following files show up as uncovered by unit tests in the pytest coverage report. This issue is to figure out whether these lines require additional unit tests or whether they erroneously show up as uncovered in the pytest report.
- `mitiq/shadows/quantum_processing.py`: L140 (ignored temporarily due to #2129)
- `mitiq/shadows/shadows.py`: L68, 73, 78, 137 (ignored temporarily due to #2129)
- `mitiq/shadows/shadows_utils.py`: L61, 88, 91-96 (ignored temporarily due to #2129)

Will update the issue description later with more details about the specific lines.
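Until those lines get proper tests, one way to keep the temporary exclusions auditable (a suggested convention, not an existing mitiq pattern; the function below is a toy stand-in for the shadows utilities) is to pair each pragma with a comment referencing the blocking issue:

```python
def mean_estimate(values: list[float]) -> float:
    """Toy function standing in for the shadows utilities."""
    if not values:
        # TODO: remove this exclusion once the blocking issue is fixed.
        raise ValueError("values must be non-empty")  # pragma: no cover
    return sum(values) / len(values)
```

Grepping for the pragma then gives a ready-made checklist of exclusions to revisit once the upstream issue closes.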