nagkumar opened 1 year ago
You should not need all this template code anymore with version 1.30.0.
You can look at the test suite example code to see how to configure suites with 1.30.0+.
Wow, the 1.30.x approach is far simpler.. thank you for making it better. Now I have the code as:
package com.tejasoft.tests.ju.ju5.ut.perf.noconnor;
import com.github.noconnor.junitperf.JUnitPerfReportingConfig;
import com.github.noconnor.junitperf.JUnitPerfTest;
import com.github.noconnor.junitperf.JUnitPerfTestActiveConfig;
import com.github.noconnor.junitperf.JUnitPerfTestRequirement;
import org.junit.platform.suite.api.ConfigurationParameter;
import org.junit.platform.suite.api.SelectPackages;
import org.junit.platform.suite.api.Suite;
import static com.tejasoft.tests.ju.ju5.ut.perf.noconnor.utils.ReportingUtils.newHtmlReporter;
@Suite
@SelectPackages("com")
@JUnitPerfTest(totalExecutions = 100)
@JUnitPerfTestRequirement(allowedErrorPercentage = 0.01F)
@ConfigurationParameter(key = "junit.jupiter.extensions.autodetection.enabled", value = "true")
public final class TestPerfNCRSuite
{
    @JUnitPerfTestActiveConfig
    public static JUnitPerfReportingConfig config =
        JUnitPerfReportingConfig.builder()
                                .reportGenerator(newHtmlReporter("ncr_perf_suite_report.html"))
                                .build();
}
When this suite is run, other suites that do not have JUnitPerf annotations are also included; hence it gives the following error and stops:
com.tejasoft.edu.dsa.bbraces.tests.ju.ju5.ut.suites.TestBBSuites
X initializationError
org.junit.platform.suite.engine.NoTestsDiscoveredException: Suite [com.tejasoft.edu.dsa.bbraces.tests.ju.ju5.ut.suites.TestBBSuites] did not discover any tests
com.tejasoft.tests.ju.ju5.ut.suites.learn.TestSuiteEx
X initializationError
org.junit.platform.suite.engine.NoTestsDiscoveredException: Suite [com.tejasoft.tests.ju.ju5.ut.suites.learn.TestSuiteEx] did not discover any tests
package com.tejasoft.edu.dsa.bbraces.tests.ju.ju5.ut.suites;
import com.tejasoft.edu.dsa.bbraces.tests.ju.ju5.ut.TestBBOrdered;
import com.tejasoft.edu.dsa.bbraces.tests.ju.ju5.ut.TestBBRandom;
import org.junit.platform.suite.api.IncludeTags;
import org.junit.platform.suite.api.SelectClasses;
import org.junit.platform.suite.api.Suite;
@Suite
@SelectClasses({TestBBOrdered.class, TestBBRandom.class})
//@SelectPackages("com.tejasoft.dsa.bbraces.**")
@IncludeTags({"slow", "integration"})
public final class TestBBSuites
{
}
package com.tejasoft.tests.ju.ju5.ut.suites.learn;
import org.junit.platform.suite.api.SelectClasses;
import org.junit.platform.suite.api.Suite;
@Suite
@SelectClasses({TestExAdd.class})
public class TestSuiteEx
{
}
I just wish to run the normal JUnit suite 100 times, without having to change the usual JUnit test cases in any way for performance testing... Does every normal JUnit test need to be annotated for the JUnitPerf suite to work?
Spotted an issue with reflection and method access that's fixed now with 1.31.0, so you might want to upgrade to that.
I just wish to run the normal JUnit suite 100 times, without having to change the usual JUnit test cases in any way for performance testing
This is not possible with this framework.
To run a performance test you need to annotate the test OR the suite with at least @JUnitPerfTest to configure how you want to run the perf test; @JUnitPerfTestRequirement is optional, if you want to validate the results programmatically.
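A minimal sketch of the per-test variant mentioned above, i.e. annotating a single test directly rather than a suite (the class and method names here are hypothetical; the interceptor is registered explicitly with @ExtendWith instead of relying on the autodetection configuration parameter):

```java
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;

import com.github.noconnor.junitperf.JUnitPerfInterceptor;
import com.github.noconnor.junitperf.JUnitPerfTest;
import com.github.noconnor.junitperf.JUnitPerfTestRequirement;

// Hypothetical test class: register the junitperf interceptor on the class,
// then annotate the individual test with the perf configuration
@ExtendWith(JUnitPerfInterceptor.class)
class SomeServiceTest
{
    // Run this one test 100 times; allow up to 1% of invocations to fail
    @Test
    @JUnitPerfTest(totalExecutions = 100)
    @JUnitPerfTestRequirement(allowedErrorPercentage = 0.01F)
    void shouldRespondQuickly()
    {
        // ... exercise the code under test with ordinary assertions ...
    }
}
```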
Does every normal JUnit test need to be annotated for the JUnitPerf suite to work?
No, you should be able to annotate the @Suite class with @JUnitPerfTest and all tests in the suite will inherit the @JUnitPerfTest configuration.
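To illustrate the inheritance point: the test class itself can stay completely free of junitperf annotations, and only the suite carries the perf configuration (a sketch; PlainAdditionTest and PerfSuite are hypothetical names):

```java
import org.junit.jupiter.api.Test;
import org.junit.platform.suite.api.ConfigurationParameter;
import org.junit.platform.suite.api.SelectClasses;
import org.junit.platform.suite.api.Suite;

import com.github.noconnor.junitperf.JUnitPerfTest;

// A plain JUnit 5 test, untouched by junitperf (hypothetical class)
class PlainAdditionTest
{
    @Test
    void addsNumbers()
    {
        // ... ordinary assertions ...
    }
}

// Only the suite is annotated; every selected test inherits the perf config
@Suite
@SelectClasses(PlainAdditionTest.class)
@JUnitPerfTest(totalExecutions = 100)
@ConfigurationParameter(key = "junit.jupiter.extensions.autodetection.enabled", value = "true")
class PerfSuite
{
}
```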
When this suite is run, other suites that do not have JUnitPerf annotations are also included; hence it gives the following error and stops
@ConfigurationParameter(key = "junit.jupiter.extensions.autodetection.enabled", value = "true"): this setting tells JUnit to enable the JUnitPerfInterceptor globally. The interceptor will then look at every test and check if the test is annotated directly OR if the test is running as part of a suite that is annotated. That's the only way JUnit 5 currently provides to hook into suite lifecycle events that I can see.
I suspect this error: org.junit.platform.suite.engine.NoTestsDiscoveredException: Suite [com.tejasoft.edu.dsa.bbraces.tests.ju.ju5.ut.suites.TestBBSuites] did not discover any tests
is because the test was set up to find all tests in the package (@SelectPackages("com")). You might need to be more selective when setting up your perf suite, as you showed in your later example (i.e. use @SelectClasses({TestBBOrdered.class, TestBBRandom.class}) or a more restrictive package name).
You could also look at adding @ExcludeClassNamePatterns or @ExcludePackages to explicitly exclude TestBBSuites and TestSuiteEx from the perf test suite.
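A sketch of what such an exclusion could look like on the perf suite (the regex patterns are illustrative; @ExcludeClassNamePatterns takes regular expressions, so adjust them to your package layout):

```java
import org.junit.platform.suite.api.ConfigurationParameter;
import org.junit.platform.suite.api.ExcludeClassNamePatterns;
import org.junit.platform.suite.api.SelectPackages;
import org.junit.platform.suite.api.Suite;

import com.github.noconnor.junitperf.JUnitPerfTest;

// Keep the broad package scan, but exclude the nested suite classes
// so the suite engine does not try to expand them
@Suite
@SelectPackages("com")
@ExcludeClassNamePatterns({".*TestBBSuites", ".*TestSuiteEx"})
@JUnitPerfTest(totalExecutions = 100)
@ConfigurationParameter(key = "junit.jupiter.extensions.autodetection.enabled", value = "true")
class TestPerfNCRSuite
{
}
```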
Even with 1.31.0, the same error..
My PerfSuite is annotated with @JUnitPerfTest:
package com.tejasoft.tests.ju.ju5.ut.perf.noconnor;
import com.github.noconnor.junitperf.JUnitPerfReportingConfig;
import com.github.noconnor.junitperf.JUnitPerfTest;
import com.github.noconnor.junitperf.JUnitPerfTestActiveConfig;
import com.github.noconnor.junitperf.JUnitPerfTestRequirement;
import org.junit.platform.suite.api.ConfigurationParameter;
import org.junit.platform.suite.api.SelectPackages;
import org.junit.platform.suite.api.Suite;
import static com.tejasoft.tests.ju.ju5.ut.perf.noconnor.utils.ReportingUtils.newHtmlReporter;
@Suite
@SelectPackages("com")
@JUnitPerfTest(totalExecutions = 10)
@JUnitPerfTestRequirement
@ConfigurationParameter(key = "junit.jupiter.extensions.autodetection.enabled", value = "true")
public final class TestPerfNCRSuite
{
    @JUnitPerfTestActiveConfig
    public static JUnitPerfReportingConfig config =
        JUnitPerfReportingConfig.builder()
                                .reportGenerator(newHtmlReporter("ncr_perf_suite_report.html"))
                                .build();
}
However, this perf suite includes a few other suites, such as:
package com.tejasoft.edu.dsa.bbraces.tests.ju.ju5.ut.suites;
import com.tejasoft.edu.dsa.bbraces.tests.ju.ju5.ut.TestBBOrdered;
import com.tejasoft.edu.dsa.bbraces.tests.ju.ju5.ut.TestBBRandom;
import org.junit.platform.suite.api.IncludeTags;
import org.junit.platform.suite.api.SelectClasses;
import org.junit.platform.suite.api.Suite;
@Suite
@SelectClasses({TestBBOrdered.class, TestBBRandom.class})
//@SelectPackages("com.tejasoft.dsa.bbraces.**")
@IncludeTags({"slow", "integration"})
public final class TestBBSuites
{
}
In such a context, do these suites also need to be annotated specially? As a suite is also a test case, inheritance should work there too..
These TestBBSuites work properly as part of the full gradle test goal. They fail with
com.tejasoft.edu.dsa.bbraces.tests.ju.ju5.ut.suites.TestBBSuites
X initializationError
org.junit.platform.suite.engine.NoTestsDiscoveredException: Suite [com.tejasoft.edu.dsa.bbraces.tests.ju.ju5.ut.suites.TestBBSuites] did not discover any tests
only when the perf suite is included as part of the test goal, or when explicitly running the perf suite like this:
gradle clean test --tests "**.TestPerfNCRSuite"
I tried disabling these TestBBSuites and TestSuiteEx, and also with exclude, but the same error is always seen..
This is not possible with this framework
I am OK with the PerfSuite alone being annotated with @JUnitPerfTest. However, this perf suite may internally include other JUnit suites; as long as these internal suites need not be annotated I am fine, as those suites are part of the normal test goal and run fine as part of the gradle test goal.
That error is not coming from the junitperf framework, it's coming from the JUnit suite engine (i.e. org.junit.platform.suite.engine.SuiteTestDescriptor):
org.junit.platform.suite.engine.NoTestsDiscoveredException: Suite [com.tejasoft.tests.ju.ju5.ut.suites.TestBBSuites] did not discover any tests
at org.junit.platform.suite.engine.SuiteTestDescriptor.computeTestExecutionResult(SuiteTestDescriptor.java:134)
at org.junit.platform.suite.engine.SuiteTestDescriptor.execute(SuiteTestDescriptor.java:129)
at org.junit.platform.suite.engine.SuiteTestEngine.lambda$execute$0(SuiteTestEngine.java:73)
at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195)
You can confirm this by removing all references to junitperf, removing the ConfigurationParameter, and then re-running the TestPerfNCRSuite class. You will get the same error.
This seems to be an issue that occurs when @SelectPackages is used and the package scan finds a @Suite class.
It seems the suite class tests are not expanded after the first level (so the nested suite "appears" to have no tests and the JUnit suite engine throws an exception).
To work around this issue you can add the following exclusion to the TestPerfNCRSuite class:
@ExcludeClassNamePatterns("com.tejasoft.tests.ju.ju5.ut.suites.*")
Or avoid using @SelectPackages if possible.
java.lang.AssertionError: Error threshold not achieved
at com.github.noconnor.junitperf.statements.PerformanceEvaluationStatement.assertThat(PerformanceEvaluationStatement.java:113)
at com.github.noconnor.junitperf.statements.PerformanceEvaluationStatement.assertThresholdsMet(PerformanceEvaluationStatement.java:95)
at com.github.noconnor.junitperf.statements.PerformanceEvaluationStatement.runParallelEvaluation(PerformanceEvaluationStatement.java:75)
at com.github.noconnor.junitperf.JUnitPerfInterceptor.interceptTestMethod(JUnitPerfInterceptor.java:119)
What does this mean? Is it the default time to run all the tests, etc.? How do I control this?
Maybe you wanted to say: threshold value of xxx not met, hence it is an error, etc.
If you add @JUnitPerfTestRequirement to the test suite, the default values for this annotation will be used to validate the test results. Default values are documented here.
The default behaviour is to expect no errors (see allowedErrorPercentage).
Any exceptions or assertion failures that occur during the test will cause the test to fail, unless you specify a non-zero value for allowedErrorPercentage or remove the @JUnitPerfTestRequirement annotation.
I have removed it; still the same error..
package com.tejasoft.tests.ju.ju5.ut.perf.noconnor;
import com.github.noconnor.junitperf.JUnitPerfReportingConfig;
import com.github.noconnor.junitperf.JUnitPerfTest;
import com.github.noconnor.junitperf.JUnitPerfTestActiveConfig;
import com.github.noconnor.junitperf.JUnitPerfTestRequirement;
import org.junit.platform.suite.api.SelectPackages;
import org.junit.platform.suite.api.Suite;
import static com.tejasoft.tests.ju.ju5.ut.perf.noconnor.utils.ReportingUtils.newHtmlReporter;
@Suite
@SelectPackages("com.tejasoft")
//@JUnitPerfTest(totalExecutions = 1000, rampUpPeriodMs=10000, threads = 20, warmUpMs = 1000)
@JUnitPerfTest(totalExecutions = 10)
//@JUnitPerfTestRequirement
public final class TestSuitePerfNCR
{
    @JUnitPerfTestActiveConfig
    public static JUnitPerfReportingConfig config =
        JUnitPerfReportingConfig.builder()
                                .reportGenerator(newHtmlReporter("ncr_perf_suite_report.html"))
                                .build();
}
I misspoke, the default behaviour is to expect no errors (whether or not the @JUnitPerfTestRequirement annotation is present).
You can enable trace logging on EvaluationTask to identify the source of the errors.
Or you can allow a certain percentage of errors by annotating with @JUnitPerfTestRequirement and setting allowedErrorPercentage to a non-zero value.
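For example (values illustrative; note that allowedErrorPercentage is expressed as a fraction between 0 and 1, consistent with the 0.01F used earlier in this thread):

```java
// Illustrative: run 100 times and tolerate up to 5% failed invocations
// (0.05F is a fraction, i.e. 5%, not 0.05%)
@JUnitPerfTest(totalExecutions = 100)
@JUnitPerfTestRequirement(allowedErrorPercentage = 0.05F)
```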
This says 90% as errors. Is it more correct to say these are the tests that have failed to meet the performance timeline, or does it also include other kinds of errors?
Also, what is the latency time specified for each method against which this is shown as an error? E.g. testWithdrawWithSufficientBalanceAndAccess says 90% error: w.r.t. what time have 90% of the tests taken more time?
The results state that testWithdrawalWithSufficientBalanceAndAccess was a success (no assertions within the test method failed). An error is a test assertion failure.
The graph on the left gives you a latency breakdown: it gives you the % of requests at or below a certain latency. Because you only had 10 invocations, there will be some interpolation of the percentile distributions. The throughput is calculated as a per-second rate; because you did 10 invocations in 22ms, the per-second rate is so small it's rounded to 0. This throughput is relevant for longer tests with higher invocation counts.
If this test requires an @BeforeEach to be executed before each performance test invocation, you'll need to update to 1.33.0.
Any reason for such behaviour? I expect JUnitPerf not to touch any of the JUnit per-test lifecycle methods, as the launcher/engine should take care of running the test methods the usual way, while JUnitPerf just calls the test suite methods 10 times, or the configured number.
You can enable trace logging on EvaluationTask to identify the source of the errors.
Thank you, this is the way. I enabled logging; adding this here for others' future reference.
Add a file logback.xml in the main/resources folder with content:
<configuration>
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>
    <root level="TRACE">
        <appender-ref ref="STDOUT" />
    </root>
    <logger name="com.github.noconnor.junitperf.statements.EvaluationTask" level="TRACE"/>
</configuration>
and add the following 3 dependencies to the gradle build file:
add("implementation", "ch.qos.logback:logback-core:1.4.8")
add("implementation", "ch.qos.logback:logback-classic:1.4.8")
add("implementation", "org.slf4j:slf4j-api:2.0.7")
Look at the JUnit 5 tests dashboard console output to see the trace output.
I'll look into adding support to skip tests that fail Assumptions
The above class used to work in the x.26.SNAPSHOT version, but with the x.30 version it shows errors as:
Any clues how to fix this based on the changes made between the 26 and 30 versions?