JussiPekonen opened this issue 2 years ago
Can you give us some feedback on this one, @kward?
One thing I forgot to mention in the original issue description: should there ever be support for JUnit test reporting (see #16 and #127), this change would also make it possible to report the number of skipped tests correctly in that report.
After rethinking this a bit, it would make sense to keep the current skipped asserts implementation and refine the generated report to indicate the number of skipped asserts more clearly. That's why I updated the title of the issue.
Should the skipped tests feature be implemented, I would suggest the following changes:

- The current skipping functions are renamed to `startSkippingAsserts`, `endSkippingAsserts`, and `isSkippingAsserts`, ultimately replacing the current `*Skipping` functions. The old functions would be kept for backwards compatibility, but they would print a warning message about their deprecation. Tests and documentation would be updated accordingly.
- New functions `startSkippingTests` and `endSkippingTests` (and possibly `isSkippingTests`) would be added. When the start function is called, the tests after that line would be skipped until the end of the file, or until the end function is called, after which the tests would no longer be skipped.

I am about to get these changes done. Some polishing is still needed before I can make a PR for the changes.
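To illustrate, here is a minimal stand-in simulation of the proposed file-level skipping. The `startSkippingTests`/`endSkippingTests`/`isSkippingTests` functions and the `runTest` driver are hypothetical sketches of the semantics described above, not actual shunit2 code:

```shell
#!/bin/sh
# Sketch of the proposed file-level skipping. These functions do not
# exist in shunit2; they are stand-ins for the behavior proposed above.
_skipping_tests=0
skippedTests=0

startSkippingTests() { _skipping_tests=1; }
endSkippingTests()   { _skipping_tests=0; }
isSkippingTests()    { [ "${_skipping_tests}" -eq 1 ]; }

# Hypothetical runner: skips the test function entirely (and counts it)
# while skipping is active.
runTest() {
  if isSkippingTests; then
    skippedTests=$((skippedTests + 1))
    return 0
  fi
  "$1"
}

testAlpha() { echo "ran testAlpha"; }
testBeta()  { echo "ran testBeta"; }

runTest testAlpha   # runs normally
startSkippingTests
runTest testBeta    # skipped: called between start and end
endSkippingTests
echo "skippedTests=${skippedTests}"
```

With this shape, a skipped test case increments its own counter exactly once, independent of how many asserts it contains.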
The proposal I had in the previous comment had to be tweaked a little. My suggestion for the changes is now the following:

- The current skipping functions are renamed to `startSkippingAsserts`, `endSkippingAsserts`, and `isSkippingAsserts`. The old `*Skipping` functions are kept for backwards compatibility, but they print a warning message about the function being deprecated.
- A new function `skipTest` is introduced. It must be called as the very first thing in the test function (before any asserts and fails), and it disables the verification calls completely. Furthermore, it does not step the counters for the total, passed, failed, and skipped asserts. This function must be called in every test that should be skipped. In addition, it prints a warning to stderr that the test is being skipped.

Draft PR #161 created for these changes. Feel free to check it out.
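The `skipTest` behavior can be sketched as follows. This is a stand-in simulation using the names and semantics from the comment above, not the actual shunit2 implementation (the real `assertTrue` does considerably more):

```shell
#!/bin/sh
# Sketch of the proposed skipTest semantics (hypothetical, not shunit2).
_test_skipped=0
skippedTests=0
assertsTotal=0

skipTest() {
  _test_skipped=1
  skippedTests=$((skippedTests + 1))
  echo "shunit2:WARN test skipped" >&2
}

# Asserts become no-ops for a skipped test and leave all counters alone.
assertTrue() {
  if [ "${_test_skipped}" -eq 1 ]; then return 0; fi
  assertsTotal=$((assertsTotal + 1))
  eval "$1"
}

testNotReadyYet() {
  skipTest                  # must be the very first call in the test
  assertTrue '[ 1 -eq 2 ]'  # ignored; would otherwise fail
}

testNotReadyYet
echo "skippedTests=${skippedTests} assertsTotal=${assertsTotal}"
```

Note that because `skipTest` bypasses the asserts entirely, the assert counters stay untouched and only the skipped-tests counter moves.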
The current `startSkipping`/`endSkipping` functionality is all about skipping asserts within test cases. While that mechanism can be used to skip test cases completely, the "skips" counter counts the number of skipped asserts rather than the number of skipped test cases. This can lead to situations where the reported number of skipped tests does not match the number of test cases that were actually skipped.

For example, consider a test case that is skipped in its entirety and contains two asserts.
At the end of the generated report there would be something like `OK/FAILED (skipped=X+2)`, where X is the number of other skipped asserts. If this test case were the only one that uses skipping, the report could be misread as saying that 2 test cases were skipped.
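The mismatch described above can be demonstrated with a minimal simulation. The counters here are stand-ins, not the real shunit2 internals; the point is only that one wholly skipped test case with two asserts bumps the shared "skipped" counter twice:

```shell
#!/bin/sh
# Simulation of the counter mismatch: the "skipped" counter tracks
# skipped asserts, not skipped test cases (stand-in code, not shunit2).
skippedAsserts=0
_skipping=0

startSkipping() { _skipping=1; }
endSkipping()   { _skipping=0; }

assertEquals() {
  if [ "${_skipping}" -eq 1 ]; then
    skippedAsserts=$((skippedAsserts + 1))
    return 0
  fi
  [ "$1" = "$2" ]
}

# One test case, skipped in its entirety, containing two asserts.
testSkippedCase() {
  startSkipping
  assertEquals 1 1
  assertEquals 2 2
  endSkipping
}

testSkippedCase
# Only 1 test case was skipped, yet the report would show skipped=2.
echo "skipped=${skippedAsserts}"
```

A reader of the final report has no way to tell whether `skipped=2` means two skipped test cases or one skipped test case with two asserts.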
I would like to open a discussion on the topic. Do you see any benefit in having a skipping mechanism for individual test cases as well? If so, it would need a separate counter that is reported at the end of the test run. Furthermore, the current skipped counter could then be reused in the report to indicate how many asserts were skipped.
If this gets support, I can do the needed work to get this done.