Closed docian closed 6 months ago
The latest released version of TestNG is 7.0.0-beta1. Please retry with that.
If the problem persists, please help share a sample that can be used to reproduce the problem.
package co.testng.bug;

import org.testng.SkipException;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class DemoBug {

    @BeforeMethod(alwaysRun = true)
    public void setup(Object[] o) {
        // TestNG injects the current iteration's parameters here
        String[] s = (String[]) o[0];
        System.out.println("printed from setup:" + s[0]);
        if (s[0].equals("Billy")) {
            throw new SkipException("doesnt matter");
        }
    }

    @Test(dataProvider = "dataIssuer")
    public void test(String[] arg) {
        System.out.println("printed from test:" + arg[0]);
    }

    @AfterMethod(alwaysRun = true)
    public void teardown() {
        System.out.println("---teardown---");
    }

    @DataProvider(name = "dataIssuer")
    public Object[][] issueData() {
        return new Object[][] {{"John"}, {"Billy"}, {"Jimmy"}};
    }
}
That's the simplest code that shows the issue. TestNG is skipping the third run of the test as well.
The javadoc for alwaysRun says that when it's used, the method ignores group filtering and always runs. But in your case you are not using any groups.
Not quite sure I understand what the expectation is here.
See here : http://static.javadoc.io/org.testng/testng/7.0.0-beta1/org/testng/annotations/BeforeMethod.html#alwaysRun--
Is there a chance that you have misinterpreted the functionality of the alwaysRun attribute? It's used to tell TestNG: "always run this configuration method irrespective of which group I chose to run".
It doesn't matter. The fact is that the test methods are not running after a SkipException in @BeforeMethod, and this is not desired behavior, since the test should be skipped only for that specific data set, and the check is performed in the before method.
It does matter. You are explicitly causing a test to be skipped by throwing an exception from its configuration method.
A data-driven test is not "n" tests; it's one test running with "n" sets of data.
So I don't see why TestNG should try to execute the next iteration when a config method failed for a previous iteration.
AFAIK TestNG is working as designed here.
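The behavior being debated can be sketched as a rough, plain-Java model of how TestNG drives a data-driven test (this is only an illustration of the observed behavior, not TestNG's actual implementation; all names are made up):

```java
import java.util.ArrayList;
import java.util.List;

// Rough model: one test method, n data rows, with setup/teardown invoked
// around each iteration. Once a setup call fails, the remaining iterations
// are skipped too -- the behavior observed in this issue.
public class IterationModel {

    static List<String> run(String[] rows) {
        List<String> statuses = new ArrayList<>();
        boolean configFailed = false;
        for (String row : rows) {
            if (configFailed) {
                statuses.add("SKIP"); // the earlier config failure cascades
                continue;
            }
            try {
                setup(row);           // stands in for @BeforeMethod
                statuses.add("PASS"); // the test method itself ran
            } catch (RuntimeException skip) {
                configFailed = true;  // the config failure is remembered...
                statuses.add("SKIP"); // ...and this iteration is skipped
            }
            // a teardown marked alwaysRun would still execute every time
        }
        return statuses;
    }

    static void setup(String row) {
        if (row.equals("Billy")) {
            throw new RuntimeException("stand-in for SkipException");
        }
    }

    public static void main(String[] args) {
        // Matches the reporter's observation: John passes, Billy and Jimmy skip
        System.out.println(run(new String[] {"John", "Billy", "Jimmy"}));
    }
}
```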
You can try setting configfailurepolicy to continue in your suite XML and check if that helps.
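For reference, a hypothetical suite file with that setting could look like the following (the class name is taken from the sample above; the rest of the file is illustrative):

```xml
<!-- Illustrative testng.xml; only the configfailurepolicy attribute matters here -->
<suite name="Default Suite" configfailurepolicy="continue">
  <test name="demo">
    <classes>
      <class name="co.testng.bug.DemoBug"/>
    </classes>
  </test>
</suite>
```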
No... still not running. The output is here:
printed from setup:John
printed from test:John
---teardown---
printed from setup:Billy
---teardown---
printed from setup:Jimmy
---teardown---
It skips the third execution of the test as well.
Then in that case, TestNG is working just as I explained. You cannot have TestNG run the next iteration of a data-driven test method when a previous iteration was skipped due to a configuration failure.
Hmmm... the configuration failure refers to a specific configuration per method; it could fail for one set of data but not for all. In this case it loses the power of a data-driven framework, since each test execution could be a separate test case, as it is here.
Yes, it's working as you explained, but not in a way that's useful for a test framework. Everything has an explanation, but not every behavior is a desired one!
@docian - That's one way of looking at it. I don't think @BeforeMethod as a configuration method was created with a data-driven test method's iterations in mind. Logically, if a configuration method fails, then all the test methods that depend on it should be skipped. The configuration method is not meant to introspect the test method's data and take actions based on it; it is meant to just do setup for you.
I believe what you want here is an IHookable implementation.
Here's a sample that shows what I am talking about
import org.testng.IHookCallBack;
import org.testng.IHookable;
import org.testng.ITestResult;
import org.testng.SkipException;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class DemoBug implements IHookable {

    @Test(dataProvider = "dataIssuer")
    public void test(String arg) {
        System.out.println("printed from test:" + arg);
    }

    @AfterMethod(alwaysRun = true)
    public void teardown() {
        System.out.println("---teardown---");
    }

    @DataProvider(name = "dataIssuer")
    public Object[][] issueData() {
        return new Object[][] {{"John"}, {"Billy"}, {"Jimmy"}};
    }

    @Override
    public void run(IHookCallBack callBack, ITestResult testResult) {
        Object[] parameters = callBack.getParameters();
        if (parameters[0].toString().equalsIgnoreCase("Billy")) {
            testResult.setThrowable(new SkipException("Causing skip for " + parameters[0]));
            testResult.setStatus(ITestResult.SKIP);
        } else {
            callBack.runTestMethod(testResult);
        }
    }
}
Here's the output (I am using TestNG 7.0.0-beta1, which is the latest released version as of today):
printed from test:John
---teardown---
Test ignored.
---teardown---
printed from test:Jimmy
---teardown---
===============================================
Default Suite
Total tests run: 3, Passes: 2, Failures: 0, Skips: 1
===============================================
Process finished with exit code 0
This is a bit hard to explain. TestNG doesn't create multiple instances of the class DemoBug, one per parameter group. It just runs the test method with the same instance of DemoBug, passing a different parameter group each time. However, it does call @BeforeMethod for each parameter group. Furthermore, as soon as one invocation of @BeforeMethod fails, TestNG marks the configuration as failed and skips the execution of the test method (here, test(String arg)) - that is not quite reasonable and should be fixed. However, it's not related to the attribute alwaysRun=true.
@wengm
that's a bit not reasonable and should be fixed.
Do you want to help me understand why you feel it's not reasonable and should be fixed? Why do you expect this configuration method to be retried for every attempt of the same test, but with a different parameter? (Internally, how TestNG manages a data-driven test is not something that, IMO, a user needs to worry about.)
Due to bad comment editor, I create a wiki page for explanation. Please go to https://github.com/cbeust/testng/wiki/Issue1991:-DataProvider-Execution for details.
It's still not clear to me what the expectation is here. Can you please elaborate?
On a side note, I noticed that you aren't using 7.0.0-beta1, the latest released version as of today.
AFAIK, if a @BeforeMethod fails, then TestNG will skip the method. If it's a data-driven test, then TestNG will skip the rest of its iterations.
@juherr @cbeust WDYT ?
Thank you, it solved the problem.
Closing this issue based on the OPs inputs.
@krmahadevan
@BeforeMethod for them. Otherwise, it's a waste to run @BeforeMethod for iterations from x+1. Furthermore, it will result in confusion - @BeforeMethod may be successful for iterations from x+1, but the tests are skipped due to the failure of @BeforeMethod in iteration x. BTW, what's the criterion for closing tickets? As long as the user's problem is solved by any workaround, or when all problems and confusions are clarified?
Yes, I agree that it is a workaround, but I don't have enough time to explain what is obvious.
@wengm I have tried my best to explain. Looks like you still believe this to be an issue.
Anyway, I will reopen this issue and let either @cbeust or @juherr respond.
@wengm please help submit a test that can be run to simulate the problem (preferably with assertions instead of print statements). Also, please make sure you are using 7.0.0-beta1 (the latest released version) and try with 7.0.0-SNAPSHOT (the upcoming release candidate).
If I understand the issue well, a before-method run should be linked to its test-method run. That looks like the more logical behavior.
But at the same time, TestNG, by design, uses the same instance for each run and allows users to share a context from one run to the next. So a failing configuration method may compromise the next test runs and produce wrong results.
Maybe a configuration option for this behavior would be a good compromise, but in my opinion the current implementation is the best one for the moment.
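The shared-instance concern can be illustrated with a minimal, TestNG-free sketch (all names here are made up): because one object sees every iteration, state left behind by a failing setup is visible to the next run.

```java
// Minimal sketch of the shared-instance concern: TestNG reuses one
// test-class instance across all iterations, so state mutated (or left
// half-initialized by a failing setup) in one iteration leaks into the next.
public class SharedInstanceSketch {

    int setupCalls = 0;              // instance state shared across iterations
    boolean resourceReady = false;

    void setup(String row) {         // stands in for @BeforeMethod
        setupCalls++;
        resourceReady = false;
        if (row.equals("Billy")) {
            throw new RuntimeException("setup failed");
        }
        resourceReady = true;
    }

    public static void main(String[] args) {
        SharedInstanceSketch instance = new SharedInstanceSketch(); // one instance, as in TestNG
        for (String row : new String[] {"John", "Billy", "Jimmy"}) {
            try {
                instance.setup(row);
            } catch (RuntimeException ignored) {
                // a framework must now decide whether later iterations
                // can trust the possibly compromised shared state
            }
        }
        System.out.println(instance.setupCalls); // the same object saw every iteration
    }
}
```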
I propose to keep the issue open.
I've just noticed the issue is talking about skipped tests and not failing configuration methods.
Maybe the behavior can be different between failing and skipping. WDYT @krmahadevan?
@juherr - IMO there is no issue here, which is what I have tried explaining before :)
I am not entirely sure why a @BeforeMethod configuration method should be executed for the next iteration of a test method. Moreover, there are ways of getting this done through other means within TestNG. So I don't think we need to do anything on this issue.
I think you should make a sharp distinction between the "failed" and "skipped" statuses of a test. That's why I'm saying that a skipped test shouldn't lead to skipping the following tests.
Yes, I agree with that. I can't find a use case where we'd want to skip all methods after the first skipped one. And I wonder whether configuration methods are really the best place for the skip logic (instead of having it in the test method itself).
The before method is the best place to skip a test when the test has a bug attached to it. I'm integrating TestNG with Azure DevOps (VSTS), and at the beginning I'm making such a check: if the test has an Active or New bug attached to it, I have to skip the test (it doesn't make sense to execute it). This way I can push the limits of QA automation and minimize manual operations. The benefits are multiple.
@docian
I think you should make a sharp distinction between the "failed" and "skipped" statuses of a test. That's why I'm saying that a skipped test shouldn't lead to skipping the following tests.
If we go back to the example that you shared in https://github.com/cbeust/testng/issues/1991#issuecomment-450557019, you would notice that a configuration method is throwing an exception, which means it's failing. So the test method is getting skipped.
It's not the skipped status of a test method that is causing the rest of the iterations to be skipped; the rest of the iterations of the test method are being skipped because a configuration method failed.
A test method can get skipped for two reasons: because a configuration method it depends upon failed, or because it was explicitly skipped (for example via a SkipException).
So there's a very clear distinction that TestNG is already making.
What you are asking for is that TestNG try running the remaining iterations of a test even though a particular iteration was skipped due to a configuration failure.
What I was basically trying to say is that TestNG does not retry failed configuration methods. That has been the basic behavior till now.
The question now boils down to: should TestNG support retrying a failed configuration method (@BeforeMethod in this case) after it failed for a previous iteration? The reason I say this is that, if TestNG were to attempt to continue with the next iteration, it would need to run the @BeforeMethod once again (even though it had already failed during a previous iteration).
Hope that clears out what I have been trying to say.
And that is why I also felt (just as @juherr calls out in his comment in https://github.com/cbeust/testng/issues/1991#issuecomment-450736667) that perhaps for your use case you should be doing this within the @Test method itself, or from a higher level of abstraction, such as an IHookable implementation (which is what I shared in my sample).
The before method is the best place to skip a test when the test has a bug attached to it
Where is this tag coming from? If it's from a data provider, then I would look at it as just metadata that helps in making a decision. And that decision needn't come from a configuration method (you aren't setting up anything per se via your configuration method); you are merely looking for a mechanism to intercept the data with which a test is going to be executed and do something before the test method actually consumes that data/metadata. I believe IHookable is certainly a good fit for this.
Here's another sample that does the same thing but makes use of a TestNG listener (which is also another way of abstracting out the decision of when to run a test and when to skip it)
package com.rationaleemotions.github.issue1991;

import com.rationaleemotions.github.issue1991.Demo2.DecisionMaker;
import org.testng.IInvokedMethod;
import org.testng.IInvokedMethodListener;
import org.testng.ITestResult;
import org.testng.SkipException;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Listeners;
import org.testng.annotations.Test;

@Listeners(DecisionMaker.class)
public class Demo2 {

    @Test(dataProvider = "dataIssuer")
    public void test(String arg) {
        System.out.println("printed from test:" + arg);
    }

    @AfterMethod(alwaysRun = true)
    public void teardown() {
        System.out.println("---teardown---");
    }

    @DataProvider(name = "dataIssuer")
    public Object[][] issueData() {
        return new Object[][] {{"John"}, {"Billy"}, {"Jimmy"}};
    }

    public static class DecisionMaker implements IInvokedMethodListener {

        @Override
        public void beforeInvocation(IInvokedMethod method, ITestResult testResult) {
            if (method.isConfigurationMethod()) {
                return;
            }
            Object[] parameters = testResult.getParameters();
            if (parameters[0].toString().equalsIgnoreCase("Billy")) {
                testResult.setThrowable(new SkipException("Causing skip for " + parameters[0]));
                testResult.setStatus(ITestResult.SKIP);
            }
        }
    }
}
And here's the output for the same
printed from test:John
---teardown---
printed from test:Billy
Test ignored.
---teardown---
printed from test:Jimmy
---teardown---
===============================================
Default Suite
Total tests run: 3, Passes: 2, Failures: 0, Skips: 1
===============================================
Process finished with exit code 0
Listeners give you much more control in terms of separating out concerns, and also give you the luxury of wiring them on or off at will (listeners can be wired in using the <listeners> tag in the suite file).
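For completeness, wiring the listener through the suite file could look something like this (the class names match the sample above; the rest of the file is illustrative):

```xml
<!-- Illustrative testng.xml: the listener is wired on or off here,
     without touching the test class -->
<suite name="Default Suite">
  <listeners>
    <listener class-name="com.rationaleemotions.github.issue1991.Demo2$DecisionMaker"/>
  </listeners>
  <test name="demo">
    <classes>
      <class name="com.rationaleemotions.github.issue1991.Demo2"/>
    </classes>
  </test>
</suite>
```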
I've tried to show you a valid use case. From my point of view I can find many workarounds, but I've tried to find out whether there is a way to fulfill my goals. When I throw a SkipException, I expect to find it in the reports as well. That's why I am doing it this way. Of course I can build a lot around the TestNG library, but that is not the idea.
The reason I say this is because, if TestNG were to attempt to continue running the next iteration, then it would need to run the @BeforeMethod once again (Even though it had already failed during a previous iteration).
Actually, TestNG does run the @BeforeMethod for each iteration, regardless of whether it failed in a previous iteration for any reason. I already tested this on both 7.0.0-beta1 and 6.14.3. Of course, it's a waste, because the test for the remaining iterations will be skipped due to the failure in a previous iteration. However, because data can be injected into @BeforeMethod, it can behave differently in different iterations for different data groups - for some data groups it may fail, and for others it may pass. Thus, the failure in one iteration should not impact the other iterations.
I already summarized this in the wiki page Issue1991: DataProvider Execution:
Either, if @BeforeMethod is executed once per test run for each data group (the current implementation), then each test run should be atomic, and the @BeforeMethod of one test run should not impact the other test runs.
Or, @BeforeMethod should be executed once before the test method, rather than once per test run for each data group, when a data provider is used. If this is applied, then it's reasonable to skip the whole test method execution when @BeforeMethod fails.
@krmahadevan Your workaround is right, and listeners are often a recommended solution. But it should be possible to write the same thing with an annotation. And SkipException should have a specific treatment everywhere possible.
If @BeforeMethod is executed for each test run for each data group (current implementation), then the test run should be atomic and @BeforeMethod of one test run should not impact the other test runs.
I've built my test suite in this direction. @BeforeMethod is used to set up objects, while the injected Object[] contains parameters for the test logic. @AfterMethod is responsible for disposing of all objects instantiated in @BeforeMethod.
Though all the parameterized tests share the same flow, my team considers them separate tests.
I have an Appium test suite with many tests that share some common steps (interactions with the app), so I have a @BeforeMethod-annotated method that contains those common steps. What happened is: when @BeforeMethod ran for the first test method, an element did not show up in time, so that test failed. When @BeforeMethod ran for the second test, it ran fine, but the subsequent methods were skipped because @BeforeMethod had failed the first time. If @BeforeMethod is running for every test, the outcome should be determined per test.
Hi, has this been solved in a later update? TestNG version 7.4.0.
I'm having a similar yet simpler issue. I throw a SkipException in a @BeforeMethod based on the test method's name, e.g. if the method name is test1, skip. Then I have two test methods:

@Test
public void test1() {
}

@Test
public void test2() {
}

On run it throws the skip for test1, but then every subsequent test method also skips. I would expect execution to pick back up in the @AfterMethod of test1, then rerun the @BeforeMethod for test2 and continue, not keep skipping.
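The outcome this commenter expects can be modeled in plain Java (an illustration of the expectation, not TestNG code; the names are made up): the skip condition is evaluated fresh for each test method, so skipping test1 should not cascade into test2.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Model of the expected behavior: the before-method check is re-evaluated
// per test method, so a skip of test1 does not cascade into test2.
public class PerMethodSkipModel {

    static boolean shouldSkip(String methodName) { // the check described above
        return methodName.equals("test1");
    }

    static Map<String, String> run(String... methods) {
        Map<String, String> statuses = new LinkedHashMap<>();
        for (String method : methods) {
            // evaluate the condition fresh for every test method
            statuses.put(method, shouldSkip(method) ? "SKIP" : "PASS");
        }
        return statuses;
    }

    public static void main(String[] args) {
        // test1 is skipped, test2 still runs
        System.out.println(run("test1", "test2"));
    }
}
```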
When I run a class that uses the listener
I have revisited this issue using TestNG 7.10.2 (the latest released version as of today) with the below sample. I am not able to reproduce the issue. Closing this issue. Please comment with an altered/tweaked version of the sample used here that can be used to recreate the issue, if this is still a problem.
Setting configfailurepolicy="continue" causes TestNG to retry the config method for the next iteration.
TestNG Version
6.14.3
Expected behavior
If one @BeforeMethod skips (and the @Test skips as well), I expect that in the next cycle (alwaysRun=true), if the conditional method runs well, the test is performed with the new data row. The test method has an associated DataProvider.
Actual behavior
If one @BeforeMethod skips, no @Test-annotated method is executed from then on.
Is the issue reproducible on runner?
Test case sample