Open nagkumar opened 1 year ago
It's due to the fact that there are multiple threads (20 in this case) updating and reading the same execution counter. `totalExecutions` is a best-effort target. To make it an exact target, thread synchronisation logic would need to be introduced, and it would be necessary to make sure that synchronisation did not impact latency measurements — which would involve introducing a lot of complexity that may not add a lot of value.
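The overshoot can be reproduced with a plain `AtomicLong` shared between worker threads — a simplified stand-in for the framework's counter (the class and names here are illustrative, not junitperf internals):

```java
import java.util.concurrent.atomic.AtomicLong;

// Each worker checks the target *before* executing, so several threads can
// pass the check simultaneously and the final count overshoots the target.
public class BestEffortCounter {

    public static long run(int threads, long target) throws InterruptedException {
        AtomicLong executions = new AtomicLong();
        Runnable worker = () -> {
            while (executions.get() < target) {
                executions.incrementAndGet(); // simulated test execution
            }
        };
        Thread[] pool = new Thread[threads];
        for (int i = 0; i < threads; i++) {
            pool[i] = new Thread(worker);
            pool[i].start();
        }
        for (Thread t : pool) {
            t.join();
        }
        return executions.get();
    }

    public static void main(String[] args) throws InterruptedException {
        // Always at least the target, often a little above it.
        System.out.println("target=1000, actual=" + run(20, 1_000));
    }
}
```

Each thread can have at most one "in-flight" execution after the target is reached, so the overshoot is bounded by the thread count — small enough that eliminating it would not justify the synchronisation cost.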
> `durationMs` | Total time to run the test in milliseconds (ms) (includes warmup period)

So reading about the best-effort target: `totalExecutions` is optional, and one can give `durationMs` instead, so that for that duration the same unit test is called as many times as possible.
`totalExecutions` is optional; if set, the test will complete after at least `totalExecutions`, but the count may be slightly more than the target, as you have already seen.

`durationMs` has a default value of 60 secs, so if `totalExecutions` is not set the test will run as many executions as it can within the 60 seconds. If `totalExecutions` is set, the test will complete when `totalExecutions` is reached OR after `durationMs` if `totalExecutions` is not reached within the 60 seconds (i.e. if `totalExecutions=100` but each call takes 10 seconds, then you will reach `durationMs` before the target executions is reached).

The only caveat to this is you can set `maxExecutionsPerSecond`, which will rate-limit the executions to a certain number of executions per second. I.e. if you set `maxExecutionsPerSecond=100` and `durationMs=60,000`, the test will attempt to run 100 iterations per second for 60 seconds. (This is useful when you are testing against an API, or you don't want to max out the CPU running a very fast test.)
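As a minimal sketch of that combination (the test class and method names are illustrative, not from the original suite):

```java
import com.github.noconnor.junitperf.JUnitPerfTest;
import org.junit.jupiter.api.Test;

public class RateLimitedPerfTest {

    // Run for 60 seconds, but never more than 100 executions per second,
    // so at most ~6000 executions in total.
    @Test
    @JUnitPerfTest(durationMs = 60_000, maxExecutionsPerSecond = 100)
    public void apiLatencyTest() {
        // call the system under test here
    }
}
```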
> `durationMs` has a default value of 60 secs, so if `totalExecutions` is not set the test will run as many executions as it can within the 60 seconds. If `totalExecutions` is set, the test will complete when `totalExecutions` is reached OR after `durationMs` if `totalExecutions` is not reached within the 60 seconds (i.e. if `totalExecutions=100` but each call takes 10 seconds, then you will reach `durationMs` before the target executions is reached).
I am trying to understand this better, hence I tried `@JUnitPerfTest` with no options (i.e. it must be taking the default `durationMs` of 60 seconds):
```java
@Suite
@SelectPackages("com.tejasoft")
@ExcludeClassNamePatterns({"^.*aprt.*$"})
@JUnitPerfTest
public final class TestSuitePerfNCR
{
    @JUnitPerfTestActiveConfig
    public static JUnitPerfReportingConfig config =
        JUnitPerfReportingConfig.builder()
                                .reportGenerator(new HtmlReportGenerator("ncr_perf_suite_report.html"))
                                .build();
}
```
It has invoked just a few unit tests, but each as many as 4 million times. Why has it not invoked all the unit tests at least once, and then iterated over the whole suite of methods again?

Is there any way I can know which tests it has discovered as per the junitperf suite, so that we know it has discovered more but only ran some of them? Any log info to debug?
Also, I observed that the test does not complete in around 1 min; instead, it runs for > 10 min.

So in the suite class, when we set `durationMs` to X, does it take this as the value for each unit test or for the entire suite (however many unit tests the suite has)? Just confirming, as you already said these annotations are inherited by each test method in the unit tests.

If it is for each test method, then I expect all tests to be run. However, when I ran the same suite the same way, this time it ran more tests. When I gave `@JUnitPerfTest(totalExecutions = 1)`, the total number of tests run was as many as shown in the attached pdf.
So the junitperf framework does not discover the tests; the `junit-platform-suite-engine` does test discovery. The `junit-platform-suite-engine` test driver then iterates over each test in the suite, and for each test the junitperf interceptor is called.

The `@JUnitPerfTest` annotation is then applied to each test, so if you've set a `durationMs` of 60 seconds, each test will be called as many times as possible for 60 seconds. If you have not set the tests to run in parallel (using surefire or junit settings), the default behaviour for `junit-platform-suite-engine` is to run the tests in sequence, one after the other.

So if you use the default 60 sec duration, and you have 344 tests with 115 skipped, it will take (344 - 115) = 229 mins to complete, i.e. the duration applies at the individual test level.

For large test suites like yours, you are probably better off using `totalExecutions`, so each test will run for a max of `totalExecutions` instead of using `durationMs`.
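A minimal sketch of that suggestion, reusing the suite from the earlier example (the execution count of 1000 is an arbitrary illustration):

```java
// Cap each discovered test at ~1000 executions instead of a fixed duration.
@Suite
@SelectPackages("com.tejasoft")
@ExcludeClassNamePatterns({"^.*aprt.*$"})
@JUnitPerfTest(totalExecutions = 1_000)
public final class TestSuitePerfNCR
{
    @JUnitPerfTestActiveConfig
    public static JUnitPerfReportingConfig config =
        JUnitPerfReportingConfig.builder()
                                .reportGenerator(new HtmlReportGenerator("ncr_perf_suite_report.html"))
                                .build();
}
```

With fast tests this makes the suite's total runtime roughly proportional to per-execution latency rather than a fixed 60 seconds per test.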
> have 344 tests, 115 skipped, it will take (344-115) mins
Surprisingly, my suite even with this many tests is ending in around 12 min; I am unable to know why this is happening. However, as noted, the number of tests shown in the report is reduced substantially.
> So the junitperf framework does not discover the test, the junit-platform-suite-engine does test discovery.
Quite clear; that way, every run of the same junitperf-annotated suite should always get all the tests as shown in the pdf.

Maybe in the report you could put the total number of discovered tests, the annotation parameters applied, and how many it could test. Also, the table listing all the tests could be numbered; that way it is easy to observe with numbers in the report — i.e. a summary similar to that seen in normal JUnit reports.
You may be hitting a memory limit somewhere trying to run that many tests.

You could try adding a fixed window size for the statistics calculator:

```java
@JUnitPerfTestActiveConfig
public static JUnitPerfReportingConfig config = JUnitPerfReportingConfig.builder()
    .reportGenerator(new HtmlReportGenerator("ncr_perf_suite_report.html"))
    .statisticsCalculatorSupplier(() -> new DescriptiveStatisticsCalculator(1_000_000))
    .build();
```

Adding a `totalExecutions` limit would also probably help.
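The idea behind the fixed window can be sketched with a plain ring buffer that keeps only the last N latency samples, so memory stays bounded no matter how many executions run (this is an illustration of the concept, not the junitperf implementation):

```java
// Keeps only the most recent `capacity` samples; older ones are overwritten,
// so memory use is O(capacity) regardless of total executions.
public class WindowedStats {
    private final double[] window;
    private long seen = 0;

    public WindowedStats(int capacity) {
        this.window = new double[capacity];
    }

    public void add(double sample) {
        window[(int) (seen % window.length)] = sample;
        seen++;
    }

    public double mean() {
        int n = (int) Math.min(seen, window.length);
        double sum = 0;
        for (int i = 0; i < n; i++) {
            sum += window[i];
        }
        return n == 0 ? Double.NaN : sum / n;
    }
}
```

The trade-off is that statistics are computed over the most recent window rather than the whole run, which is usually acceptable for latency percentiles.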
> you may put total discovered tests and the annotation parameters applied and how many it could test..
The junitperf framework does not have access to how many tests were discovered; that's all managed by the junit `junit-platform-suite-engine`. junitperf just gets called once for each test.

It would only ever be possible to add a count of executed/skipped tests. Even total duration would be hard to track, as there are no suite-level lifecycle methods/hooks provided by the junit framework that I can see.
Can you provide a list of expected tests/classes/packages (maybe a pdf or screenshot of the normal JUnit report)?
test.zip - Normal JUnit Report
```java
@JUnitPerfTestActiveConfig
public static JUnitPerfReportingConfig config = JUnitPerfReportingConfig.builder()
    .reportGenerator(new HtmlReportGenerator("ncr_perf_suite_report.html"))
    .statisticsCalculatorSupplier(() -> new DescriptiveStatisticsCalculator(1_000_000))
    .build();
```
Will try this and update in a day or two.
If the junit engine is asking to run each test once at a time, why are so many runs happening, i.e. 4 million times?
> memory limit somewhere trying to run that many tests.
I have not seen any such error, as my laptop is 64 GB, i7 11th generation. I shall continue to watch for memory errors; however, it is a possibility and I will check the logs, as the total suite normally exits in around 11 min when it might be expected to take 180+ min.
I have 1000 as `totalExecutions`; however, the invocation count shows slightly more than 1000 in many cases. I know this is not a bug — I was looking to capture such a difference, as there may be some reason behind it.