VDFaller opened this issue 5 years ago
Could you elaborate on your `add failure` example? Is it just the addition of `file_name`? And which specific `Reporter` are you considering modifying? If you don't have a specific `Reporter` in mind, one option to consider here might be similar (in idea) to the `UtWindowDispatchingReporter`, but using `Include File List` instead of `Current Window`.
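For illustration, a reporter could discover the file it is running from with something like the following. This is a sketch only; it assumes `Include File List()` returns the current include stack with the innermost (currently running) file last.

```jsl
// Sketch: find the file currently being executed, which a reporter
// could record alongside each result. Assumes Include File List()
// returns the include stack with the innermost file last.
files = Include File List();
current file = If( N Items( files ) > 0,
	files[N Items( files )],
	"" // empty when the script is not running from an included file
);
Show( current file );
```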
As for timing, there are a couple of places where we have started down this road. One is the `ut elapsed time` matcher. Another is the `UtTimingReporter`. Going forward, I see several options:

1. Use `ut elapsed time` to measure single assertions and something similar to `UtTimingReporter` to measure entire files. The former requires modifying tests, and the latter doesn't offer much granularity.
2. Use `Setup`/`Teardown` to do per-test timings and report them (in `Teardown`) through a traditional assertion on the time (works with any `Reporter`), through a custom method specific to your reporter, or through another channel entirely (save to a table, a global list, etc.). This also requires modifying your test cases; see the sketch after this list.
3. Wrap `ut test case` (in something like a `ut timed test case`) so that it times its work and reports through one of the channels above. Or perhaps we could use the `payload` argument and write a `Reporter` that pulls timings out of `payload` and reports them. This would require little change to existing tests. We could even decorate `ut assert that` within `ut timed test case` to time individual assertions. See example of decorated assertion for existing `ut test case`.
4. Add a `ut timed test case` or `ut benchmark` or something like that. This could automatically run several iterations of the body to get reliable timings. See something like Google Benchmark.

The last two options are likely general enough to include directly in jsl-hamcrest. For all the options, unless you have a "correct" or "baseline" timing, there can really be no assertion on the current timing. For that reason, the current timing either needs to ride in `payload` with an assertion or (more likely) it needs to be given a new `Reporter` method entirely. Or maybe even an entirely new type of reporter object (with no `add failure`, etc.), depending on whether the benchmarks can also have assertions.
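For what it's worth, option 2 might look roughly like this. This is a sketch, not library code: it assumes the usual `ut test case`/`ut test` pattern with `Setup`/`Teardown` messages, and the one-second bound and names like `start time` are made up for illustration.

```jsl
// Sketch: time each test via Setup/Teardown and assert on the result.
// HP Time() returns a high-resolution timestamp in microseconds.
suite = ut test case( "Timed Case" )
	<< Setup( Expr( start time = HP Time() ) )
	<< Teardown( Expr(
		elapsed = (HP Time() - start time) / 1e6; // seconds
		ut assert that( Expr( elapsed ), ut less than( 1 ) ); // arbitrary bound
	) );
ut test( suite, "fast enough", Expr(
	ut assert that( Expr( 1 + 1 ), ut equal to( 2 ) );
) );
```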
Sorry, yeah, I was thinking generically. Basically, the goal is to get the reporter to know which file/`ut test case`/`ut test`/`ut assert` was run for every `ut assert that` that is made, so that it is easier and faster to trace CI/CD failures for large projects.
And for timing, I was thinking of just benchmarking, so that we can watch the timings of `ut test`s and get a warning if they suddenly increase (between JMP versions or commits). I could do the `Setup` and `Teardown` on all of our cases, but I think having all reporters know the timings of their tests would be useful.

All of this we're writing into the custom reporter I'm working on in #56, but I figured that if it was something other reporters could use, it should go in here first.
I'm interested in adding the file that the tests came from, like @VDFaller mentioned. Specifically, to the `JUnitXMLReporter`, so it would have `<testcase name="test1" file="MyTest.jsl" ...>`. Since the JUnit reporter is based on the `UtCollectingReporter`, I can see two ways to get the file name while the results are being collected:

1. Add the file name to the `add failure`, `add success`, and `add unexpected throw` methods.
2. Modify `ut concat test label` so that the output could be `file name > test case name > test name > label`, and we could parse it out in the reporter (see the sketch after this list). This way the file name could be available to more than just the `JUnitXMLReporter`.

@emccorkle Do you think either of these makes sense, or do you see any alternatives?
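If it helps, the concatenated label from option 2 would be straightforward to split back apart in a reporter. Here is a sketch using standard JSL string functions, with a made-up `label` value (it assumes ">" never appears inside the label text itself):

```jsl
// Sketch: split "file name > test case name > test name > label" apart.
// Words() splits on any of the given delimiter characters.
label = "MyTest.jsl > My Case > test1 > checks the answer";
parts = Words( label, ">" );
For( i = 1, i <= N Items( parts ), i++,
	parts[i] = Trim( parts[i] )
);
file name = parts[1]; // "MyTest.jsl"
Show( file name );
```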
I think modifying `UtCollectingReporter` to collect file info via Vince's method (1) would be fine, and then modifying the reports of `UtCollectingReporter` and `JUnitXMLReporter` to use that info. That should be a pretty low-impact change.
Thanks @emccorkle. I've made this change and various other improvements to the `JUnitXMLReporter` in #110.
I'd like to see timing and module in the reporters as well.

Timing

Timing would allow us to see if performance changes across different runs. I think having an `HP Time()` at the start and end of `ut test case()` and `ut assert that()` would help us see performance differences (a rough sketch follows). I'm probably missing some places.
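As a rough illustration of that bracketing idea (a sketch, not jsl-hamcrest internals; the `Wait` and the `Write` reporting are just for demonstration):

```jsl
// Sketch: bracket a single assertion with HP Time() and log the delta.
// HP Time() is in microseconds, so divide by 1e6 for seconds.
t0 = HP Time();
ut assert that( Expr( Wait( 0.1 ); 1 + 1 ), ut equal to( 2 ) );
Write( "assertion took ", Char( (HP Time() - t0) / 1e6 ), " s\!N" );
```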
Module

Also, I was thinking about adding the module to each of the reporter `add X()` functions. For instance, something like the sketch below. I'm not sure what impact that would have elsewhere though.
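Purely as a hypothetical sketch (none of these parameter names come from the existing reporter API; they are illustrative only), the idea would be for each `add X()` to receive the module along with the result:

```jsl
// Hypothetical: an "add failure" that also receives the module/file
// the assertion came from. Signature and names are illustrative only.
add failure = Function( {test case name, test name, description, module},
	Write( "FAIL [", module, "] ", test case name, " > ", test name,
		": ", description, "\!N" )
);
// Made-up usage:
add failure( "My Case", "test1", "expected 2 but got 3", "MyTest.jsl" );
```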