dorny / test-reporter

Displays test results from popular testing frameworks directly in GitHub
MIT License

Java JUnit support #81

Open dorny opened 3 years ago

dorny commented 3 years ago

Tracking issue for JUnit support.

Experimental support was added in v1.3.0 with java-junit reporter. Implementation is based on test result files taken from lhotari/pulsar.

Due to my lack of experience/interest with the Java ecosystem, no Java project to create test fixtures was added. There is also no documentation on how to use JUnit to get XML results - I expect there are plenty of other resources on this topic.

@lhotari Here you can see how it looks: https://github.com/dorny/pulsar/runs/2052225393?check_suite_focus=true My setup is based on your lh-refactor-pulsar-ci-with-retries branch. I made one test fail on purpose to verify that failure annotations are working. I also modified the CI workflow to upload merged-test-report.xml instead of one XML per test class. It works both ways, but with the XML-per-test-class approach you get more noise in the report. I've also noticed there are some duplicates. Multiple test results with the same name are present in the XML as well, so it's probably not a bug. Still, I'm not sure why that is and whether I should handle it in some specific way. Please keep me updated on how it works for you.

lhotari commented 3 years ago

Wow! Nice work @dorny . This is really helpful. Thank you!

btw. In the Pulsar build, there are multiple reasons for the duplicates. Some tests are run multiple times with different configurations. There are also a few cases where the same class name is reused in different modules. There are also currently some test retries in place for handling flaky tests. The current solution is that if any of the test retries succeeds, the build will pass. There are plans to change the flaky test handling so that test retries will be enabled only for tests that are categorized as "quarantined".

I'll let you know how the work proceeds. Thank you once again for putting the effort into supporting java-junit and testing it extensively with the Pulsar build!

johanlofberg commented 3 years ago

Complete beginner in the field, so I'm not sure if this is a user problem (i.e. me) or an issue in the test-reporter.

I am using the MATLAB action to generate JUnit XML files via MATLAB's test system, and then run test-reporter on the generated files. Here is an example of a generated XML:

<?xml version="1.0" encoding="utf-8"?>
<testsuites>
   <testsuite errors="0" failures="0" name="test_operator_abs" skipped="0" tests="2"
              time="0.50334">
      <testcase classname="test_operator_abs" name="test1" time="0.23014"/>
      <testcase classname="test_operator_abs" name="test2" time="0.2732"/>
   </testsuite>
   <testsuite errors="0" failures="0" name="test_operator_alldifferent" skipped="0" tests="2"
              time="0.57926">
      <testcase classname="test_operator_alldifferent" name="test1" time="0.29553"/>
      <testcase classname="test_operator_alldifferent" name="test2" time="0.28373"/>
   </testsuite>
</testsuites>

When test-reporter runs on this file, I receive the following:

  with:
    name: Junit
    path: *.xml
    reporter: java-junit
    list-suites: all
    list-tests: all
    max-annotations: 10
    fail-on-error: true
    token: ***
Check runs will be created with SHA=f82f3d9235619082511ed24aadc1ab5a73a27e40
Listing all files tracked by git
Found 1783 files tracked by GitHub
Using test report parser 'java-junit'
Creating test report Junit
  Processing test results from operators.xml
Error: Cannot read property 'time' of undefined

johanlofberg commented 3 years ago

Found it: the outer <testsuites> is not expected and causes issues. Trimming it makes it work.

johanlofberg commented 3 years ago

...although not completely, as only the first suite is added to the report. The second suite, test_alldifferent, is lost when I delete the opening and closing <testsuites> tags, which is what makes it run.

dorny commented 3 years ago

Hi! Thanks for reporting the issue šŸ‘ I've only tested the processing of Java JUnit XML against one example so I'm not that surprised about this. Anyway, it should be quite easy to fix. I will look into it asap.

johanlofberg commented 3 years ago

Great to hear. Your package appears to be what can move me from a home-baked system to full automation on GitHub, so thanks for your efforts.

dorny commented 3 years ago

@johanlofberg: The issue was not the presence of <testsuites> - the parser already supported it. The time attribute was missing.

JUnit XML is a good example of something that looks like a standard but where, in practice, every implementation is different. In your case, the XML was created by MATLAB instead of a Java library. I've modified the parser so it can handle it.

This action has no official support for MATLAB. I would have to add tests and some documentation, and unfortunately I don't have time for it now :) Anyway, if you find some other issue, please report it and I will try to fix it.

Fix is here: #115
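[Editor's note] For readers hitting the same class of error with other tools' JUnit XML, the failure mode above (a missing time attribute, plus a <testsuites> wrapper) can be guarded against as follows. This is a minimal Python sketch of the parsing idea only, not the action's actual TypeScript implementation; JUNIT_XML, suite_elements, and parse_junit are illustrative names.

```python
import xml.etree.ElementTree as ET

# Illustrative sample: a <testsuites> wrapper whose test case
# has no time attribute (as in the MATLAB-generated report above).
JUNIT_XML = """<testsuites>
   <testsuite name="test_operator_abs" tests="1">
      <testcase classname="test_operator_abs" name="test1"/>
   </testsuite>
</testsuites>"""

def suite_elements(root):
    # Accept both a bare <testsuite> root and a <testsuites> wrapper.
    if root.tag == "testsuite":
        return [root]
    return root.findall("testsuite")

def parse_junit(xml_text):
    results = []
    for suite in suite_elements(ET.fromstring(xml_text)):
        for case in suite.findall("testcase"):
            results.append({
                "name": case.get("name"),
                # Default a missing time attribute to 0 instead of
                # crashing on an undefined value.
                "time": float(case.get("time", "0")),
            })
    return results

print(parse_junit(JUNIT_XML))  # → [{'name': 'test1', 'time': 0.0}]
```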

johanlofberg commented 3 years ago

Awesome. All 29 tests in the suite are now listed and included, but it appears to misclassify the 3 failed tests: they are listed as passed despite having failed and being recorded as such in the XML:

   <testsuite errors="1" failures="0" name="test_sdpvar_geomean_complex" skipped="0"
              tests="1"
              time="1.9047">
      <testcase classname="test_sdpvar_geomean_complex" name="test1" time="1.9047">
         <error>Error occurred in test_sdpvar_geomean_complex/test1 and it did not run to completion.
    ---------
    Error ID:
    ---------
    'MATLAB:assertion:failed'
    --------------
    Error Details:
    --------------
    Error using test_sdpvar_geomean_complex&gt;test1 (line 12)
    Assertion failed.</error>
      </testcase>
   </testsuite>

https://github.com/yalmip/YALMIP/runs/2579024508?check_suite_focus=true

johanlofberg commented 3 years ago

Created XML: junit.zip

johanlofberg commented 3 years ago

I see now that I am raising errors instead of failures, so the problem is on my side, as your code only checks pass/fail/skip.

dorny commented 3 years ago

Well, both errors and failures are valid in JUnit. It just happens that the XML I worked with during implementation had only failures. I will make it work with errors too. Hopefully this is the last obstacle. I will let you know when it's ready.
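[Editor's note] The classification fix described here can be sketched as follows. This is a hypothetical Python illustration of treating both <error> and <failure> children as failed tests, not the action's actual code; CASE_XML and case_result are made-up names.

```python
import xml.etree.ElementTree as ET

# Illustrative test case that reports an <error> child,
# as MATLAB's assertion failures did in this thread.
CASE_XML = """<testcase classname="t" name="test1" time="1.9">
   <error>Error occurred and the test did not run to completion.</error>
</testcase>"""

def case_result(case):
    # JUnit distinguishes assertion failures (<failure>) from
    # unexpected errors (<error>); for reporting purposes both
    # should count as a failed test.
    if case.find("failure") is not None or case.find("error") is not None:
        return "failed"
    if case.find("skipped") is not None:
        return "skipped"
    return "success"

print(case_result(ET.fromstring(CASE_XML)))  # → failed
```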

garysmith-img commented 3 years ago

Hey, we've recently integrated your action into our workflow and really like the results.

I realised today that the issue above, where errors get treated like passes, is also affecting us.

Let me know if you want some help testing your fix.

Thanks for your efforts!

dorny commented 3 years ago

I've fixed the issue in #118. It should handle <error> and <failure> the same way. The v1 branch already points to this version.

Could you please confirm if it works for you now or is there any other issue?

garysmith-img commented 3 years ago

So far so good. From what I can see it seems to be handling the errors now. I'll let you know if I find anything else.

Thanks again.

Gakk commented 3 years ago

First; thanks for a great action helping us migrate to GitHub šŸ˜ƒšŸ‘

We are using NUnit with its output transformation to get JUnit-style results. But when a test fails, I don't get the failure message anywhere in your generated report, just the first line/item of the stack trace.

Is there a way, when a message is present, to have the message attribute show as the error text, and to put the complete stack trace under Raw output and Show more?

NOTE: I am parsing with java-junit, as jest-junit fails with an error (screenshot attached).

Test result XML:

<testcase name="SearchStreetsWithFilterNoHit" assertions="0" time="0.008161" status="Failed" classname="MyAppName.Client.Adapters.MapAdapter.MapAddressSearchTest">
  <failure message="System.IO.DirectoryNotFoundException : Could not find language resources for neither given culture 'nb-NO' or fallback 'en-US'.">   at MyAppName.Client.Adapters.MapAdapter.MapNavigation..ctor(IMapAdapter mapAdapter, ApplicationViewModel applicationViewModel) in C:\actions-runner\_work\MyAppName7\MyAppName7\MyAppName.Client\src\Adapters\MapAdapter\MapNavigation.cs:line 337
   at MyAppName.Client.Adapters.MapAdapter.MapBaseTest.Setup() in C:\actions-runner\_work\MyAppName7\MyAppName7\MyAppName.Client\src\Adapters\MapAdapter\MapBaseTest.cs:line 34</failure>
</testcase>

GitHub action report:

āŒ SearchStreetsWithFilterNoHit
    at MyAppName.Client.Adapters.MapAdapter.MapNavigation..ctor(IMapAdapter mapAdapter, ApplicationViewModel applicationViewModel) in C:\actions-runner\_work\MyAppName7\MyAppName7\MyAppName.Client\src\Adapters\MapAdapter\MapNavigation.cs:line 337

GitHub code annotation:

Clicking "Raw output" shows the rest of the stack trace (screenshot attached).

GitHub action annotation:

(screenshot attached)
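[Editor's note] The behavior requested in this comment (message attribute as the summary line, element text as the raw stack trace) could be sketched like this in Python. This is an illustration of the requested mapping, not what the action currently does; failure_details and CASE_XML are illustrative names.

```python
import xml.etree.ElementTree as ET

# Illustrative NUnit-transformed test case, modeled on the sample above:
# the message attribute holds the exception summary, the element text
# holds the stack trace.
CASE_XML = """<testcase name="SearchStreetsWithFilterNoHit" time="0.008">
  <failure message="System.IO.DirectoryNotFoundException : Could not find language resources">   at MyAppName.Client.Adapters.MapAdapter.MapNavigation..ctor(IMapAdapter mapAdapter) in MapNavigation.cs:line 337
   at MyAppName.Client.Adapters.MapAdapter.MapBaseTest.Setup() in MapBaseTest.cs:line 34</failure>
</testcase>"""

def failure_details(case):
    failure = case.find("failure")
    if failure is None:
        return None
    text = failure.text or ""
    # Prefer the human-readable message attribute as the summary line;
    # fall back to the first line of the trace, and keep the full
    # element text as the raw output.
    summary = failure.get("message") or text.splitlines()[0].strip()
    return {"summary": summary, "raw": text}

details = failure_details(ET.fromstring(CASE_XML))
print(details["summary"])
```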

omBratteng commented 2 years ago

Just for fun, I tested with the JUnit file export from pytest, and it actually works: https://github.com/omBratteng/dorny-test-reporter-pytest-junit/runs/3535713080
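[Editor's note] A minimal setup along these lines might look as follows. The pytest --junitxml flag is standard pytest; the step uses the action inputs shown in the log earlier in this thread (name, path, reporter), but the file paths and step names here are illustrative, and the reporter choice follows the java-junit reporter discussed above.

```yaml
# Illustrative workflow fragment: run pytest with JUnit XML output,
# then feed the report to dorny/test-reporter.
- name: Run tests
  run: pytest --junitxml=test-results/report.xml

- name: Publish test report
  uses: dorny/test-reporter@v1
  if: always()          # publish the report even when tests fail
  with:
    name: pytest results
    path: test-results/report.xml
    reporter: java-junit
```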

frankjkelly commented 1 week ago

Another person here who wishes we had the following (or a workaround):

Is there a way, when a message is present, to have the message-attribute show as the error text, and put the complete stack trace under Raw output and Show more?