rhevm-qe-automation / pytest_jira

py.test plugin to integrate with JIRA
GNU General Public License v2.0

Process jira marker during tests collecting #91

Closed gdyuldin closed 6 years ago

gdyuldin commented 6 years ago

Processing jira markers during test collection has the following advantages:

codecov-io commented 6 years ago

Codecov Report

Merging #91 into master will decrease coverage by 0.13%. The diff coverage is 100%.


@@            Coverage Diff             @@
##           master      #91      +/-   ##
==========================================
- Coverage   97.96%   97.83%   -0.14%     
==========================================
  Files           1        1              
  Lines         197      185      -12     
==========================================
- Hits          193      181      -12     
  Misses          4        4
Impacted Files Coverage Δ
pytest_jira.py 97.83% <100%> (-0.14%) :arrow_down:

Continue to review full report at Codecov.

Last update a4fdc14...da91c78.

lukas-bednar commented 6 years ago

Fixes #68. @liiight Could you please take a look as well, since you reported the issue above?

liiight commented 6 years ago

So this moves plugin logic from test execution to test collection? Makes sense to me. Pros look great.
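To make the idea concrete, here is a minimal sketch of the collection-time approach. This is not the plugin's actual code: `issue_is_open` is a hypothetical stand-in for the plugin's JIRA REST lookup, and the marker handling is simplified to the two cases discussed in this thread.

```python
import pytest

def issue_is_open(issue_id):
    # Hypothetical stand-in; the real plugin queries the JIRA REST API
    # and checks the issue's resolution status.
    return issue_id in {"CND-8629"}

def pytest_collection_modifyitems(config, items):
    # Collection-time hook: runs once, before any test executes.
    for item in items:
        marker = item.get_closest_marker("jira")
        if marker is None:
            continue
        if not issue_is_open(marker.args[0]):
            continue
        if marker.kwargs.get("run", True):
            # Open issue, test still runs: expect it to fail
            # instead of reporting a plain failure.
            item.add_marker(pytest.mark.xfail(reason=marker.args[0]))
        else:
            # run=False on an open issue: do not execute the test at all.
            item.add_marker(pytest.mark.skip(reason=marker.args[0]))
```

Because the markers are resolved before the run starts, the xfail/skip decisions show up uniformly in reports instead of being made piecemeal during each test's setup.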

gdyuldin commented 6 years ago

Yes, it moves the logic to collection time.

With the following tests:

import pytest

@pytest.mark.jira('CND-8629')
def test_pass():
    assert True

@pytest.mark.jira('CND-8629')
def test_fail():
    assert False

@pytest.mark.jira('CND-8629', run=False)
def test_pass_norun():
    assert True

@pytest.mark.jira('CND-8629', run=False)
def test_fail_norun():
    assert False

With xfail_strict enabled, it generates the following report:

<?xml version="1.0" encoding="utf-8"?>
<testsuite errors="0" failures="1" name="pytest" skips="3" tests="4" time="0.581">
    <testcase classname="test_foo" file="test_foo.py" line="4" name="test_pass" time="0.002123117446899414">
        <failure message="[XPASS(strict)] https://jira/browse/CND-8629">[XPASS(strict)] https://jira/browse/CND-8629</failure>
    </testcase>
    <testcase classname="test_foo" file="test_foo.py" line="9" name="test_fail" time="0.0051174163818359375">
        <skipped message="expected test failure">https://jira/browse/CND-8629</skipped>
    </testcase>
    <testcase classname="test_foo" file="test_foo.py" line="14" name="test_pass_norun" time="0.00080108642578125">
        <skipped message="https://jira/browse/CND-8629" type="pytest.skip">test_foo.py:14: &lt;py._xmlgen.raw object at 0x7f8368445c18&gt;</skipped>
    </testcase>
    <testcase classname="test_foo" file="test_foo.py" line="19" name="test_fail_norun" time="0.0010704994201660156">
        <skipped message="https://jira/browse/CND-8629" type="pytest.skip">test_foo.py:19: &lt;py._xmlgen.raw object at 0x7f8368445d30&gt;</skipped>
    </testcase>
</testsuite>
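For context, `xfail_strict` is a standard pytest ini option that turns an unexpected pass (XPASS) into a failure, which is why `test_pass` above is reported as failed rather than passed. It can be enabled in `pytest.ini`:

```ini
[pytest]
xfail_strict = true
```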

lukas-bednar commented 6 years ago

Thank you!