Closed: alpha1592 closed this issue 4 years ago.
We currently have the same issue. We have worked around it by tagging scenarios/specs with `omit` and changing all of our pipeline run commands to include `--tags \!omit`. This makes Gauge not run any spec/scenario carrying that tag. Some proper support for this would still be nice, though, because these scenarios are fundamentally never run, so we can't trigger any hooks for them. We have `after_scenario` hooks that update a reporting server, but these are then missed. Our workaround is to use a `before_suite` hook, grep across all specs, and do some fairly convoluted work to figure out in advance what will be skipped. (The kind of `after_scenario` hook that gets missed is sketched below.)
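For illustration, a minimal sketch of such an `after_scenario` hook, assuming gauge-java's `ExecutionContext` API; the `println` stands in for the actual call to the reporting server:

```java
import com.thoughtworks.gauge.AfterScenario;
import com.thoughtworks.gauge.ExecutionContext;

public class ReportingHooks {
    // Runs after every executed scenario -- but never for scenarios that were
    // filtered out with `--tags \!omit`, which is the gap described above.
    @AfterScenario
    public void reportScenarioResult(ExecutionContext context) {
        // In the real setup this would update the reporting server;
        // a println stands in here.
        System.out.printf("scenario=%s failed=%s%n",
                context.getCurrentScenario().getName(),
                context.getCurrentScenario().getIsFailing());
    }
}
```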
> If you define steps with no implementation, Gauge sometimes fails execution even when running a specific test scenario which does have an implementation.
That's strange. Gauge does not fail scenarios with unimplemented steps; it skips them. For example, set up a sample project:

```
$ gauge init js_simple
```

and add an unimplemented step to the existing `example.spec` file:
```
# Specification Heading

* Vowels in English language are "aeiou".

## Vowel counts in single word

tags: single word

* Skip this scenario till we fix a few issues
* The word "gauge" has "3" vowels.

## Vowel counts in multiple word

This is the second scenario in this specification

Here's a step that takes a table

* Almost all words have vowels
   |Word  |Vowel Count|
   |------|-----------|
   |Gauge |3          |
   |Mingle|2          |
   |Snap  |1          |
   |GoCD  |1          |
   |Rhythm|0          |
```
Then run the specs:

```
$ gauge run specs --hide-suggestion
```
```
[ValidationError] /Users/zabilcm/Projects/vowels/specs/example.spec:17 Step implementation not found => 'Skip this scenario till we fix a few issues'
# Specification Heading
  ## Vowel counts in multiple word ✔ ✔
Successfully generated html-report to => /Users/zabilcm/Projects/vowels/reports/html-report/index.html
Specifications: 1 executed  1 passed  0 failed  0 skipped
Scenarios:      1 executed  1 passed  0 failed  1 skipped
```
Notice that Gauge skips the scenario `Vowel counts in single word` because it has the unimplemented step `Skip this scenario till we fix a few issues`. Running the scenario by name also skips it:
```
$ gauge run --scenario="Vowel counts in single word" --hide-suggestion
```
```
[ValidationError] /Users/zabilcm/Projects/vowels/specs/example.spec:17 Step implementation not found => 'Skip this scenario till we fix a few issues'
Successfully generated html-report to => /Users/zabilcm/Projects/vowels/reports/html-report/index.html
Specifications: 0 executed  0 passed  0 failed  1 skipped
Scenarios:      0 executed  0 passed  0 failed  1 skipped
```
So, instead of tags, try using a descriptive but unimplemented step in a scenario to skip its execution.
Thanks for the feedback.
This is a workable solution for skipping test scenarios, although it does require you to add an unimplemented test step to each scenario you want skipped (about the same amount of work as adding a tag, I guess). The downside is that specs would then show errors in IntelliJ/VSCode (red underlines on the spec filename, etc.) and could cause some confusion.
However, how would you handle skipping an entire spec? If you had quite a few (100s of) scenarios, you would have to go and modify potentially hundreds of scenarios. Tags at spec level would make that very simple to handle.
Thoughts?
> If you had quite a few (100s of) scenarios, you would have to go and modify potentially hundreds of scenarios. Tags at spec level would make that very simple to handle.
Agree.
What if there were a feature to include all scenarios in the report? The ones that are not executed or filtered out would be marked as not executed in the report. Or at least the specs or scenarios filtered out with `!` would be listed in the report. For example,

```
gauge run --tags "tag1 & !pending"
```

would list the scenarios matching `tag1`, and the ones tagged `pending` would appear in the report marked as not executed.
This would work in principle when running tests through the command line or as part of CI/CD etc. However, testers spend a lot of time developing tests in IDEs like IntelliJ and/or VSCode. Both of these have plugins that let you easily run the tests you are working on. If you were developing and executing tests within IntelliJ, you would have to set up the run configuration with tags like `!pending`. This may cause some usability confusion, and with multiple testers it may be a challenge and/or added overhead to have that set up for every tester and every testing project. It would be easier if we had control of the execution cycle through code or some other predefined config.
Are we not able to get more control over the execution cycle to allow skipping tests on demand, either by throwing a specific exception or through an explicit call?
Throw an exception any time you want to skip...

```java
throw new GaugeSkipTestException("Reason for skipping...");
```

or call a static method on the Gauge Java class...

```java
Gauge.skipScenario("Reason for skipping...."); // gauge-java
Gauge.skipSpecification("Reason for skipping...."); // gauge-java
```

or pick a tag to skip by using property files...

```
# /env/default/default.properties
# will use the `pending` tag to skip tests...
gauge.tags.skipTests=pending
```
@alpha1592 If you want to share the run configuration with other team members, that is possible; there is no need for every team member to set it up manually. For IntelliJ, see https://intellij-support.jetbrains.com/hc/en-us/community/posts/360000142684-Share-Run-Configuration-with-code for how to share configurations.
Gauge controls the execution lifecycle of the specs, so adding this kind of property in a particular language runner is not easy: it would require the language runner to take control of the execution lifecycle, which the Gauge architecture does not currently support.
Also, I can think of another way to handle this. What if Gauge had a property that defined the tags to apply during execution? The property could live in the environment files and would be overridden if `--tags` is passed to Gauge. Would this help solve your problem?
> What if Gauge had a property that defined the tags to apply during execution?
I'm not sure I understand what you mean... Are you saying that we would specify a tag that needs to appear on each scenario/spec in order for the test to be executed? If so, it may make Gauge too rigid and lose some of its flexibility.
As a workaround, noted above, adding an unimplemented step to a scenario skips the scenario. I found that if you add an unimplemented context step as the first step, it skips the entire spec (example below). This works for now, although I would have liked more control over it. Right now, when a test is skipped via this workaround, the reason given in the report is that there is an unimplemented step, which is somewhat misleading, because I may have wanted to skip the test on purpose for a variety of reasons.
For now, I think we can close this issue because we are finding workarounds, and one already exists. I wanted to try to get a simpler and cleaner solution, but I guess we can revisit this at a later time. Thanks for your effort and time.
```
# Web Services Sample Tests

tags: tag1, web-services, sample, soap, rest

Provide a description for this test specification

* Skip this entire Specification because Application under test is not ready
* This step will run before every scenario.

## SOAP Web Service Example

tags: soap, temperature, celsius

Provide a description for this test scenario

* Load template "soap_request".
* Get test data from JSON Test data file.
* Build Soap Request.
* Send message to SOAP Service.
* Verify that the HTTP Response Code is "200".
```

This results in...

```
Skipped:  Skip this entire Specification because Application under test is not ready
Filename: specs\ExampleFeature.spec
Message:  Step implementation not found
```
@alpha1592 Good to know that you found a workaround. But I wasn't talking about adding a tag to all tests that need to be executed; rather, vice versa, i.e., adding a tag to all specs that need to be skipped. You can then specify the tag expression in the property file (naming the tags to be skipped), so your developers need not do any setup. It is done once, and the tags live in the property file. The only effort after the initial setup is to add the appropriate tags to any new specifications created. Your tags can also be descriptive, so you will know the reason for skipping them.
For example, if you want to skip tests because of flakiness or because a feature is not yet implemented, you can add tags like `flaky` and `unimplemented` to the appropriate specifications. You then specify a property like `gauge_tags = !flaky & !unimplemented` in your `default.properties` file (or an environment file). Then every time Gauge is run, the tags will be applied; there is no need to specify them using command-line flags.
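Sketched as a properties file, the proposal would look something like this (note that `gauge_tags` is the property suggested in this comment, not an existing Gauge setting):

```
# env/default/default.properties
# Proposed (not yet implemented): tag expression applied on every run,
# overridden when --tags is passed on the command line.
gauge_tags = !flaky & !unimplemented
```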
If the `gauge_tags` approach is something to be considered, I reckon it is a small improvement over something like having a `run_tests.sh` that wraps `gauge run` (a sketch of such a wrapper follows).
I suppose the key point here is whether the skipped scenarios need to be included in the report. If so, there needs to be some thought on how to implement that: the tag name alone may not always be enough to convey the reason for skipping the spec/scenario.
If reporting the skipped specs/scenarios is not necessary (as is the behaviour today), then I reckon the execution can be controlled using tags and tag expressions.
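For comparison, a minimal sketch of the `run_tests.sh` wrapper idea; the tag expression here is illustrative:

```sh
#!/usr/bin/env bash
# Thin wrapper so nobody has to remember the team's tag expression.
# Any extra arguments are passed straight through to gauge.
set -euo pipefail
gauge run --tags '!flaky & !unimplemented' specs "$@"
```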
@nehashri I like the idea of `gauge_tags` in the property file. If we can have that, it would be wonderful. However, as @sriv mentioned, the question is whether the results should show up in the report or not. In my opinion, it is very important to have the skipped tests included in the report, with a proper (and ideally user-defined) reason as to why certain tests were skipped.
I'm happy to see I'm not the only one with this need, and kinda sad that the only way to list skipped scenarios is to grep the spec file. How can we help move this forward?
Closing this as an old issue. However, any PRs to fix this will be merged.
Hi,
I too came up with this requirement. Let's say I have a tag `Android`. In the `BeforeSpec` hook, I get the device type at run time and want to skip the spec execution deliberately if the device type is iOS. Do we have any feature to skip the spec execution deliberately (say, a static method or an exception to skip)?
> Do we have any feature to skip the spec execution deliberately (say, a static method or an exception to skip)?
Currently, you can't skip execution at run time.
Is it possible to provide a feature in Gauge to skip execution at the spec level? Just as there is `Gauge.writeMessage("")`, could there be `Gauge.skipSpec` and `Gauge.skipScenario`? When one of these is used, all the steps in the spec should be skipped, and the same should be reflected in the report.
I could use these methods inside `BeforeSpec` hooks to determine whether a spec should run or be skipped (see the sketch below).
I am asking this again as I am in need of it, and I also see several other folks are in need of this feature.
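As a rough illustration of the request, a gauge-java sketch assuming tag-filtered hooks; `Gauge.skipSpec` does not exist today (see below), so the closest runnable approximation is to throw, which reports a failure rather than a skip. The `DEVICE_TYPE` environment variable is a hypothetical example:

```java
import com.thoughtworks.gauge.BeforeSpec;

public class DeviceHooks {
    // Runs before specs tagged "Android".
    @BeforeSpec(tags = {"Android"})
    public void requireAndroidDevice() {
        String deviceType = System.getenv("DEVICE_TYPE"); // hypothetical env var
        if ("iOS".equals(deviceType)) {
            // Desired (hypothetical) API: Gauge.skipSpec("Needs an Android device");
            // Today the only option is to abort, which reports a failure:
            throw new RuntimeException("Skipping: spec needs an Android device");
        }
    }
}
```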
> [...] Throw an exception any time you want to skip...
> throw new GaugeSkipTestException("Reason for skipping...");
> or call a static method on the Gauge Java class...
> Gauge.skipScenario("Reason for skipping...."); // gauge-java
> Gauge.skipSpecification("Reason for skipping...."); // gauge-java
> or pick a tag to skip by using property files...
> gauge.tags.skipTests=pending
Hey, are these skip Java methods deprecated, or has their usage changed?
> Hey, are these skip Java methods deprecated, or has their usage changed?
These methods weren't implemented. It was just a suggestion.
Is this on the horizon? This seems necessary for us as well...
Expected behavior
In some cases, either due to known defects/issues or to code instability, we need to be able to skip certain test scenarios or test specs altogether. It would be nice if we could trigger some event that skips an entire test specification or a particular test scenario on demand.
Another use case is where a tester or BA creates the test specifications with test steps, but there is no code implementation yet for those steps. The specs are created for documentation only, or to track test cases; this would happen early on in a project. If you define steps with no implementation, Gauge sometimes fails execution even when running a specific test scenario which does have an implementation.
I understand that skipping tests on purpose may not be the best testing practice; however, there is a definite need for this feature.
Actual behavior
This feature is not supported yet.
Possible Solution
Use the existing `tags` functionality to allow for skipping of tests. This allows for control at both the test-spec and scenario level. All we would need is a special tag that lets testers create/define a test but not have it included in the overall test execution; it would still be listed in the metrics as Skipped, rather than failed, in the reports. In this example, `:skip` would trigger the scenario or spec execution to be skipped and reported in the official report as skipped.
Currently, there is a workaround that I am using, however it results in the test scenario being reported as a failure: I am using the `@BeforeScenario` annotation in Java to set this up (sketched below).
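A minimal sketch of that kind of `@BeforeScenario` workaround, assuming gauge-java and a `skip` tag (the tag name and exception message are illustrative); throwing in the hook is what causes the scenario to show up as failed rather than skipped:

```java
import com.thoughtworks.gauge.BeforeScenario;
import com.thoughtworks.gauge.ExecutionContext;

public class SkipHooks {
    // Aborts any scenario tagged "skip" before its steps run.
    // Gauge records the abort as a failure, not a skip -- the limitation
    // this issue asks to fix.
    @BeforeScenario
    public void skipTaggedScenarios(ExecutionContext context) {
        if (context.getCurrentScenario().getTags().contains("skip")) {
            throw new RuntimeException("Scenario tagged 'skip'; skipped on purpose.");
        }
    }
}
```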