onsi / ginkgo

A Modern Testing Framework for Go
http://onsi.github.io/ginkgo/
MIT License

Detail Skipped reason when spec is skipped by cli arguments #1362

Open dschveninger opened 4 months ago

dschveninger commented 4 months ago

In the JSON produced by the ginkgo CLI run command with --json-report, is there a way to know why a test case's state is skipped when the spec was filtered out by the following CLI arguments?

- --label-filter [expression] If set, ginkgo will only run specs with labels that match the label-filter. The passed-in expression can include boolean operations (!, &&, ||, ','), groupings via '()', and regular expressions '/regexp/'. e.g. '(cat || dog) && !fruit'
- --focus [string] If set, ginkgo will only run specs that match this regular expression. Can be specified multiple times, values are ORed.
- --skip [string] If set, ginkgo will only run specs that do not match this regular expression. Can be specified multiple times, values are ORed.
- --focus-file [file (regexp) | file:line | file:lineA-lineB | file:line,line,line] If set, ginkgo will only run specs in matching files. Can be specified multiple times, values are ORed.
- --skip-file [file (regexp) | file:line | file:lineA-lineB | file:line,line,line] If set, ginkgo will skip specs in matching files. Can be specified multiple times, values are ORed.
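For concreteness, here is a minimal, made-up spec illustrating the situation (the Describe/It texts and labels are invented, and the usual suite bootstrap is assumed to exist). Running it with a label filter skips the second spec, but the JSON report records only the skipped state, not which filter caused it.

```go
package filtering_test

import (
	. "github.com/onsi/ginkgo/v2"
	. "github.com/onsi/gomega"
)

// Invoked with, for example:
//   ginkgo --label-filter='cat && !fruit' --json-report=report.json
// the second spec below is skipped, but the report only shows a "skipped"
// state with no indication of which filter excluded it.
var _ = Describe("filtering example", Label("cat"), func() {
	It("runs because its labels match the filter", func() {
		Expect(1 + 1).To(Equal(2))
	})

	It("is skipped because the fruit label is excluded", Label("fruit"), func() {
		Expect(true).To(BeTrue())
	})
})
```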

dschveninger commented 4 months ago

Looking over the focus process, it looks like the function at https://github.com/onsi/ginkgo/blob/master/internal/focus.go#L59 is what sets the Spec's Skip boolean. I was wondering if we could attach a reason to the Spec or Node, or whether there would be a better place in the architecture to do this in order to get the reason into the SpecReport: https://github.com/onsi/ginkgo/blob/881efde71ccc44d0d2ebc02f3641941e143a1c1d/types/types.go#L111
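Something along these lines is what I have in mind. This is only a toy sketch, not the real focus.go code: the Spec struct, matchesLabelFilter helper, and applyLabelFilter loop are simplified stand-ins for ginkgo's internals.

```go
package main

import "fmt"

// Hypothetical, simplified stand-in for ginkgo's internal Spec type; only
// meant to show where a SkipReason could be recorded alongside the existing
// Skip flag.
type Spec struct {
	Labels     []string
	Skip       bool
	SkipReason string // proposed new field: why the CLI skipped this spec
}

// matchesLabelFilter is a toy placeholder for the real label-filter
// expression evaluation.
func matchesLabelFilter(s Spec, label string) bool {
	for _, l := range s.Labels {
		if l == label {
			return true
		}
	}
	return false
}

// applyLabelFilter mimics the decision ApplyFocusToSpecs makes today
// (setting Skip), plus recording the proposed reason.
func applyLabelFilter(specs []Spec, label string) {
	for i := range specs {
		if !matchesLabelFilter(specs[i], label) {
			specs[i].Skip = true
			specs[i].SkipReason = fmt.Sprintf("filtered out by --label-filter %q", label)
		}
	}
}

func main() {
	specs := []Spec{{Labels: []string{"cat"}}, {Labels: []string{"fruit"}}}
	applyLabelFilter(specs, "cat")
	for _, s := range specs {
		fmt.Printf("skip=%v reason=%q\n", s.Skip, s.SkipReason)
	}
}
```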

onsi commented 4 months ago

hey @dschveninger yes that would be the right place - and i would attach the reason to the spec. but can i step back and ask: what’s the problem you’re trying to solve? i understand you want the reason the spec was skipped… but why? if we start there we can explore whether or not there are alternative ways to solve the problem.

dschveninger commented 4 months ago

We are using Ginkgo for end-to-end testing of a large environment that contains many sub-products, and the suites are maintained by over 100 engineers. We have six different test suites with different labels, and different runners that allow focus to be provided across 5 different corridors. Some suite runs can have hundreds of test cases and a similar number of skips, and the suites run 2 to 20 times a day.

At the end of a suite run we do a skip analysis to determine why test cases were skipped and to provide a summary of the skip reasons. We would like to be able to tell why a test case was skipped, for example which label was filtered out or which focus excluded it. This gives the organization confidence that the right items were executed.
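Our skip analysis today looks roughly like the sketch below. It assumes the layout ginkgo writes with --json-report=report.json (a JSON array of suite reports, each containing SpecReports whose State marshals to a string such as "skipped"); the field names are taken from ginkgo's types package but should be verified against a real report.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
	"strings"
)

// Partial view of a ginkgo JSON report entry; only the fields used here.
type specReport struct {
	State                   string   `json:"State"`
	LeafNodeText            string   `json:"LeafNodeText"`
	ContainerHierarchyTexts []string `json:"ContainerHierarchyTexts"`
}

type suiteReport struct {
	SuiteDescription string       `json:"SuiteDescription"`
	SpecReports      []specReport `json:"SpecReports"`
}

func main() {
	data, err := os.ReadFile("report.json")
	if err != nil {
		panic(err)
	}
	var suites []suiteReport
	if err := json.Unmarshal(data, &suites); err != nil {
		panic(err)
	}
	for _, suite := range suites {
		skipped := 0
		for _, spec := range suite.SpecReports {
			if spec.State != "skipped" {
				continue
			}
			skipped++
			name := strings.Join(append(spec.ContainerHierarchyTexts, spec.LeafNodeText), " ")
			// Today the report says only that the spec was skipped; there is
			// no field explaining which filter excluded it.
			fmt.Printf("  skipped: %s\n", name)
		}
		fmt.Printf("%s: %d skipped specs\n", suite.SuiteDescription, skipped)
	}
}
```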

We also have a shared function for a blocking work item that integrates with our task system: it skips the spec if the work item is still open and fails if the work item is closed.
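That helper is roughly like the following; queryWorkItem and its return type are hypothetical stand-ins for our task-system client, and the name RequireWorkItemResolved is just illustrative.

```go
package e2e_test

import (
	"fmt"

	. "github.com/onsi/ginkgo/v2"
)

// workItem and queryWorkItem are placeholders for the real task-system
// integration.
type workItem struct {
	ID   string
	Open bool
}

func queryWorkItem(id string) workItem {
	// In the real helper this would call the task-tracking system's API.
	return workItem{ID: id, Open: true}
}

// RequireWorkItemResolved skips the current spec while the blocking work
// item is still open, and fails once it is closed so that the blocking
// call gets cleaned up.
func RequireWorkItemResolved(workItemID string) {
	GinkgoHelper()
	if queryWorkItem(workItemID).Open {
		Skip(fmt.Sprintf("blocked by open work item %s", workItemID))
	}
	Fail(fmt.Sprintf("work item %s is closed; remove the blocking call from this spec", workItemID))
}
```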

We are also working toward full observability of our suites using OTEL, through tracing, logs, and metrics. The more metadata we have, the more automation we can provide to communicate with different interested parties.

Hope this provides you with a look into what we are doing.

Also, as always, thank you for a great product that enables a large number of engineers to communicate well through our code and Ginkgo's output.

onsi commented 4 months ago

wow - that’s a really interesting context you’re operating in. thanks for all the details @dschveninger - it sounds like you use the JSON output itself, correct? I think attaching the reason to the spec in the function in focus.go makes sense. For now we could simply have it be attached to the spec and emitted to JSON - no need to generate output for it on the command line. Would that work for your use-case? If so it’s an easier piece of work that would be largely confined to that file and some type definitions.
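to make that concrete, i'm imagining something like the following addition to SpecReport in types/types.go; the field name is just a placeholder and the existing fields are elided, so treat this as a sketch rather than the final shape:

```go
package types

// Hypothetical sketch of the proposed addition to SpecReport; existing
// fields are elided.
type SpecReport struct {
	// ...existing fields (State, LeafNodeText, ContainerHierarchyTexts, ...)

	// SkipReason would record which CLI filter (--label-filter, --focus,
	// --skip, --focus-file, --skip-file) excluded the spec, and stay empty
	// otherwise. Because SpecReport is what --json-report serializes, the
	// reason would show up in the JSON output with no extra work.
	SkipReason string
}
```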

Is this something you or someone from your organization could work on? If not I can try to get to it but I am behind on writing Ginkgo code these days as I have other projects taking up my writing time.

dschveninger commented 4 months ago

Here is an outline of what I think I would do:

- Add SkipReason to Spec in internal spec.go.
- Update focus.go to record why the CLI is skipping the spec by setting SkipReason in ApplyFocusToSpecs.
- Update evaluateSkipStatus in group.go to use the spec's SkipReason for the State and failure handling (a rough sketch follows below).
- No new unit tests for spec_test.go since its methods have not changed; focus.go and group.go do not have unit tests and will be covered by the integration tests.
- Update the integration test filter_test.go.
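A rough sketch of the group.go piece, with the real types and signature replaced by toy stand-ins since I have not confirmed them yet; it only shows how the recorded SkipReason could flow into the skipped state and its message.

```go
package main

import "fmt"

// Toy stand-ins for ginkgo's internal types; the real evaluateSkipStatus in
// internal/group.go has a different signature.
type Spec struct {
	Skip       bool
	SkipReason string
}

type SpecState string

const (
	SpecStatePassed  SpecState = "passed"
	SpecStateSkipped SpecState = "skipped"
)

type Failure struct {
	Message string
}

// evaluateSkipStatus carries the reason recorded by ApplyFocusToSpecs
// through to the report instead of dropping it when marking the spec skipped.
func evaluateSkipStatus(spec Spec) (SpecState, Failure) {
	if spec.Skip {
		return SpecStateSkipped, Failure{Message: spec.SkipReason}
	}
	return SpecStatePassed, Failure{}
}

func main() {
	state, failure := evaluateSkipStatus(Spec{Skip: true, SkipReason: "filtered out by --label-filter 'cat'"})
	fmt.Println(state, failure.Message)
}
```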

Can you let me know if any other changes or testing would be needed? I will try to push a change this week.

onsi commented 4 months ago

hey @dschveninger that sounds like a good start but a few more thoughts:

Thanks - and if this becomes too burdensome please let me know!

Onsi