rik-smeets closed this pull request 4 years ago
Cool ... this looks good - I like the code that is there.
I'm missing some code, though ... I'd like to make really sure that all test result providers are in sync, so I would like to see the same test cases for all the other providers as well. In the best case that just means applying the changes you made in WhenParsingVsTestResultsFileWithIndividualResults.cs to the corresponding files in the other providers. If we are lucky, the other providers handle this situation correctly; if not, more programming is needed. In any case this means extending the test harness.
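For what it's worth, a mirrored test case for one of the other providers could look roughly like the sketch below. This is only an illustration of the shape of such a test; the class, helper, and file names in it are assumptions, not the actual test code in this repository.

```csharp
using NUnit.Framework;

// Stand-in result type, only here to keep the sketch self-contained.
public enum TestResult { Passed, Failed, Inconclusive }

[TestFixture]
public class WhenParsingNUnitResultsFileWithIndividualResults
{
    // Hypothetical helper: in the real suite this would come from the provider under test,
    // parsing the sample results file produced by the test harness.
    private static TestResult GetExampleResult(string scenarioOutlineName, string[] exampleValues)
    {
        return TestResult.Passed;
    }

    [Test]
    public void ThenCanReadIndividualResultsOfScenarioOutlineExamples()
    {
        // Each example row of a scenario outline should get its own result,
        // mirroring the cases added in WhenParsingVsTestResultsFileWithIndividualResults.cs.
        var result = GetExampleResult("Scenario outline", new[] { "example value 1" });

        Assert.AreEqual(TestResult.Passed, result);
    }
}
```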
Do you have the time and knowledge to do this? If so, would you be willing to take it on?
I did a quick check, and the failing ones are:
The other providers handle it correctly as is. Could be worse I guess ;-).
I'll look into it, but I do have limited time. It doesn't help that I don't have experience with any of these test providers, so I'll first need to check how they generate the test cases and how the matching is currently performed. I'll take a look at it this Wednesday and let you know how it turns out.
If I conclude that I can't fix it or that it would take too much of my time, could we proceed with this PR in some other way? For example, by creating issues for the affected providers for these specific scenarios, so that someone with more experience with those providers can fix them?
If you conclude you can't do all of it, we'll find some other way of continuing. The Cucumber and CucumberJS providers for example are always a pain - I won't hold it against you if you leave those alone ;-)
I got it working with both versions of XUnit - it turned out they don't do inconclusive test results. I had to use the (already existing) valueForInconclusive field in the StandardTestSuiteForScenarioOutlines class instead of simply asserting TestResult.Inconclusive.
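If I read that correctly, the pattern is that the shared suite holds the expected value for inconclusive examples, and a provider overrides it when its framework cannot report inconclusive results. Here is a rough sketch of that idea; everything in it except the valueForInconclusive field and the StandardTestSuiteForScenarioOutlines class name is an assumption about how the suite might be structured:

```csharp
// Stand-in result type, only here to keep the sketch self-contained.
public enum TestResult { Passed, Failed, Inconclusive }

public abstract class StandardTestSuiteForScenarioOutlines
{
    // Expected result for an example row that the provider reports as inconclusive.
    // Providers whose framework has no inconclusive state override this.
    protected TestResult valueForInconclusive = TestResult.Inconclusive;

    protected void AssertInconclusiveExample(TestResult actual)
    {
        // Compare against the provider-specific expectation instead of
        // hard-coding TestResult.Inconclusive in every test.
        NUnit.Framework.Assert.AreEqual(valueForInconclusive, actual);
    }
}

public class XUnitTestSuiteForScenarioOutlines : StandardTestSuiteForScenarioOutlines
{
    public XUnitTestSuiteForScenarioOutlines()
    {
        // xUnit does not produce inconclusive results; which value it reports instead
        // depends on the provider, so Failed here is just a placeholder.
        valueForInconclusive = TestResult.Failed;
    }
}
```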
I did not get SpecRun working. Even if I don't change anything, the generated results differ from the ones in source control, causing some SpecRun tests to fail before I've done anything at all. I would therefore propose skipping SpecRun for now and perhaps creating a known issue for it?
It seems that the test / example group names are generated differently by SpecFlow 1.9.0 (which is used in the test harness) and 2.4.0, which I am using.
@jvandertil I cannot reproduce this. After regenerating the test results file with SpecFlow 2.4.0, all unit tests are still green. Example sets are still generated in the same way in my experience.
While doing this, we could also add scenarios to cover cases like the one in #554.
@rik-smeets I'm taking a go at continuing this pull request. So far, I've done some housekeeping (merging, updating libraries, creating a task list).
I finally had some time to investigate this again. It seems SpecRun does not support this situation. Too bad.
@rik-smeets I know you've been waiting for this for a long time. Thank you for your patience!
Released in version v2.21.0.
Resolves #539 and resolves #543.
Shout-out to @jvandertil for doing part of this. @jvandertil, if you have some time, I'd appreciate it if you'd have a look at it too!
- Provide scenarios in the test harness solution
- Add corresponding tests to the test framework unit tests in the main solution