Closed. DanielBarberaDimoftache-TomTom closed this 2 days ago.
Hi Daniel, thanks for the nice words. I am open to enhancements. Can you specify what exactly you'd like to change? "etc." is a bit vague, you almost sound like my PO :-P
Hey! Apologies, I'll be more specific. Although perhaps I should consider a career change.... hmm. :)
The general idea is to add information as attributes to each testcase in the resulting XML file. As for which attributes, we're still investigating on our side which ones matter most to us, but broadly speaking: the more, the merrier.
Right now, looking at an ActionTestSummary, for example, we could extract `iteration` and `totalIterations`.
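As a sketch of what this could look like on the JUnit side, an extended testcase element might carry the extra fields as attributes. The `iteration`/`totalIterations` attribute names mirror the ActionTestSummary fields; everything else here (test names, values) is hypothetical:

```xml
<!-- Hypothetical extended JUnit testcase; iteration/totalIterations
     mirror the corresponding ActionTestSummary fields. -->
<testcase name="testLogin()" classname="LoginTests" time="0.42"
          iteration="2" totalIterations="3">
  <failure message="XCTAssertEqual failed">LoginTests.swift:87</failure>
</testcase>
```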
Let me know if that sounds good to you, and perhaps I can open a PR so we can start aligning on the matter. ^^
Sounds good to me. I can open a PR if you like, and add you to the pool of contributors.
Hey! Hope you're keeping well. To keep you updated on this: we're probably going with our own solution based on XCResultKit instead of extending XCResultParser.
We figured out that the sourcefile mentioned in the TestFailureSummary references the top-level file where the error occurred, which does not always correspond to the source file hosting a particular test (very important for us).
To get the source file of a test, you have to dig into its stack trace to find it. Gathering this and other attributes causes a significant degradation in the tool's performance, especially noticeable when running a PoC with this feature enabled on a massive (100 MB) .xcresult bundle.
Thanks for your time and for the tool, however. We've found it extremely useful. :)
Hi Daniel, you are very welcome. And yes, I investigated a little bit and came across the same problems as you describe. I hope you find a good solution!
Hey, first of all, thanks for the awesome tool. We're using it internally for test result processing, and it's been great so far.
We'd like to extend the JUnit XML output to include additional attributes from the .xcresult bundle, like the name of the source file of the test, the column and row number where the failure occurred, etc. We're more than happy to contribute upstream; we'd love to hear if you're interested so we can get aligned on this. :)