litmuschaos / litmus

Litmus helps SREs and developers practice chaos engineering in a Cloud-native way. Chaos experiments are published at the ChaosHub (https://hub.litmuschaos.io). Community notes are at https://hackmd.io/a4Zu_sH4TZGeih-xCimi3Q
https://litmuschaos.io
Apache License 2.0

Litmus Junit report #2839

Open strus38 opened 3 years ago

strus38 commented 3 years ago

Hi! Is there a way to produce a JUnit report based on Litmus experiment reports?

ksatchit commented 3 years ago

Hi @strus38 ! Thanks for bringing this up - this is a feature we will be integrating soon. Would you like to share any pointers/preferences?

jnsrikanth commented 3 years ago

Hi @ksatchit, is there a basic test report that can be generated right now, in JSON or CSV format, from the ChaosResult (YAML) objects or from the experiment logs? What system of reporting is already in place in LitmusChaos? I am thinking of extracting test results from the ChaosResult object in JSON format using JSONPath, and then running a JSON parser to get the required key-value pairs from it. Any suggestions would be welcome.

-Srikanth
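The JSONPath-then-parse idea described above could be sketched as follows. This is a minimal Python sketch, assuming the ChaosResult has been fetched as JSON (e.g. via `kubectl get chaosresult <name> -n litmus -o json`); the field names are taken from the ChaosResult sample shown later in this thread, and `extract_result` is a hypothetical helper, not part of Litmus:

```python
import json

# Sample ChaosResult as it might be returned by:
#   kubectl get chaosresult helloservice-pod-delete-pod-delete -n litmus -o json
# (structure mirrors the ChaosResult sample shown later in this thread)
chaosresult_json = """
{
  "apiVersion": "litmuschaos.io/v1alpha1",
  "kind": "ChaosResult",
  "metadata": {"name": "helloservice-pod-delete-pod-delete", "namespace": "litmus"},
  "spec": {"engine": "helloservice-pod-delete", "experiment": "pod-delete"},
  "status": {
    "experimentStatus": {
      "failStep": "N/A",
      "phase": "Completed",
      "probeSuccessPercentage": "100",
      "verdict": "Pass"
    }
  }
}
"""

def extract_result(raw: str) -> dict:
    """Pull the key/value pairs of interest out of a ChaosResult JSON blob."""
    cr = json.loads(raw)
    status = cr["status"]["experimentStatus"]
    return {
        "experiment": cr["spec"]["experiment"],
        "engine": cr["spec"]["engine"],
        "phase": status["phase"],
        "verdict": status["verdict"],
        "probeSuccessPercentage": status["probeSuccessPercentage"],
    }

print(extract_result(chaosresult_json))
```

From here the flat key-value dict can be written out as CSV or fed into whatever report generator is in use.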

ksatchit commented 3 years ago

Hey @strus38 @jnsrikanth what do you think of the following ways of getting result info? Would be interested in hearing your thoughts!

LitmusChaos xUnit result XML format


<?xml version="1.0" encoding="UTF-8"?>
<testcase probes="2" probefailures="1" name="catalogue-latency-10s">
   <properties>
      <property name="experiment.version" value="2.1.0" />
   </properties>
   <probes>
      <probe type="http-probe" name="frontend-availability-check" result="pass" />
      <probe type="cmd-probe" name="sock-shop-user-crud-check" result="pass" />
      <probe type="prom-probe" name="service-qps-check" result="fail">
         <probefailure status="Failed" message="actual value {180.99} less than {200}" />
      </probe>
   </probes>
   <fault name="pod-network-latency" targetkind="deployment" targetname="catalogue">
      <testfailure verdict="Failed" message="steady-state validation failed" />
   </fault>
</testcase>

LitmusChaos JUnit result XML generated via litmus-sdk from chaosresult YAML input


<?xml version="1.0" encoding="UTF-8"?>
<testsuite tests="1" failures="1" name="{{ .Metadata.Name }}"> <!-- catalogue-latency-10s -->
  <properties>
    <property name="fault.name" value="{{ .Spec.Experiment }}"></property> <!-- pod-network-latency --> 
    <property name="target.kind" value="{{ .Status.Targets[0].Kind }}"></property> <!-- deployment -->
    <property name="target.name" value="{{ .Status.Targets[0].Name }}"></property> <!-- catalogue -->
  </properties>
  <testcase classname="http-probe" name="frontend-availability-check">
    <failure message="Failed" type="Actual output does not meet criteria - 200 OK"></failure>
  </testcase>
  <testcase classname="cmd-probe" name="service-self-heal-under-mttr"></testcase>
</testsuite>
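The offline render proposed above could also be prototyped outside of litmus-sdk with a small script that maps ChaosResult fields onto the testsuite layout. A minimal Python sketch follows; note that the `probes` list shape and the `reason` field are assumptions for illustration (the thread does not show the exact probe-status schema of the ChaosResult):

```python
import xml.etree.ElementTree as ET

def chaosresult_to_junit(cr: dict) -> str:
    """Render a ChaosResult-like dict into the JUnit testsuite layout
    proposed above (one <testcase> per probe, probe failures as <failure>)."""
    probes = cr.get("probes", [])  # assumed shape: [{"type", "name", "result", "reason"}]
    failures = sum(1 for p in probes if p["result"] != "pass")
    suite = ET.Element("testsuite", {
        "tests": str(len(probes)),
        "failures": str(failures),
        "name": cr["metadata"]["name"],
    })
    props = ET.SubElement(suite, "properties")
    ET.SubElement(props, "property",
                  {"name": "fault.name", "value": cr["spec"]["experiment"]})
    target = cr["status"]["targets"][0]
    ET.SubElement(props, "property", {"name": "target.kind", "value": target["kind"]})
    ET.SubElement(props, "property", {"name": "target.name", "value": target["name"]})
    for p in probes:
        case = ET.SubElement(suite, "testcase",
                             {"classname": p["type"], "name": p["name"]})
        if p["result"] != "pass":
            ET.SubElement(case, "failure",
                          {"message": "Failed", "type": p.get("reason", "")})
    return ET.tostring(suite, encoding="unicode")
```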

LitmusChaos JUnit result XML format based on jenkins/ant - generated via experiment as an artifact


<?xml version="1.0" encoding="UTF-8"?>
<testsuite tests="1" failures="1" time="03.20" name="catalogue-latency-10s"> 
  <properties>
    <property name="fault.name" value="pod-network-latency"></property> 
    <property name="target.kind" value="deployment"></property> 
    <property name="target.name" value="catalogue"></property>
    <property name="app.recovery.time" value="00.30"></property>
  </properties>
  <testcase classname="http-probe" name="frontend-availability-check">
    <failure message="Failed" type="Actual output does not meet criteria - 200 OK"></failure>
  </testcase>
  <testcase classname="cmd-probe" name="service-self-heal-under-mttr"></testcase>
</testsuite>

LitmusChaos ChaosResult converted to XML format


<?xml version="1.0" encoding="UTF-8" ?>
<root>
  <apiVersion>litmuschaos.io/v1alpha1</apiVersion>
  <kind>ChaosResult</kind>
  <metadata>
    <name>helloservice-pod-delete-pod-delete</name>
    <namespace>litmus</namespace>
    <resourceVersion>8315917</resourceVersion>
  </metadata>
  <spec>
    <engine>helloservice-pod-delete</engine>
    <experiment>pod-delete</experiment>
  </spec>
  <status>
    <experimentStatus>
      <failStep>N/A</failStep>
      <phase>Completed</phase>
      <probeSuccessPercentage>100</probeSuccessPercentage>
      <verdict>Pass</verdict>
    </experimentStatus>
    <history>
      <failedRuns>0</failedRuns>
      <passedRuns>1</passedRuns>
      <stoppedRuns>0</stoppedRuns>
      <targets>
        <chaosStatus>targeted</chaosStatus>
        <kind>deployment</kind>
        <name>hello</name>
      </targets>
    </history>
  </status>
</root>
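The direct ChaosResult-to-XML conversion above is a mechanical key-to-element mapping, so it can be scripted generically. A minimal Python sketch, assuming the ChaosResult has already been parsed (from YAML or JSON) into a dict; `chaosresult_to_xml` is a hypothetical helper for illustration:

```python
import xml.etree.ElementTree as ET

def append_value(parent: ET.Element, tag: str, value) -> None:
    """Append `value` under `parent` as element(s) named `tag`.
    Lists repeat the tag per entry, matching the <targets> sample above."""
    if isinstance(value, list):
        for item in value:
            append_value(parent, tag, item)
    else:
        elem = ET.SubElement(parent, tag)
        if isinstance(value, dict):
            for key, child in value.items():
                append_value(elem, key, child)
        else:
            elem.text = str(value)

def chaosresult_to_xml(cr: dict) -> str:
    """Wrap a parsed ChaosResult dict in a <root> element, one element per key."""
    root = ET.Element("root")
    for key, value in cr.items():
        append_value(root, key, value)
    return ET.tostring(root, encoding="unicode")
```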
jordigilh commented 3 years ago

Since it is runtime generated, we will also have extra info such as run_duration and app reschedule/startup/recovery times (which are not available with an offline render):

<testsuite tests="1" failures="1" time="03.20" name="catalogue-latency-10s"> 

@ksatchit Can this information be included in the chaos result CR, so that offline generation can also display it?