pytest-dev / pytest-forked

extracted --boxed from pytest-xdist to ensure backward compat
MIT License

test_xfail[strict_xfail|non_strict_xfail] fail with pytest 7.2.0 #79

Closed · mweinelt closed this issue 1 year ago

mweinelt commented 1 year ago

After upgrading pytest from 7.1.3 to 7.2.0 in nixpkgs we're seeing the following test failures:

pytest-forked> ============================= test session starts ==============================
pytest-forked> platform linux -- Python 3.10.9, pytest-7.2.0, pluggy-1.0.0 -- /nix/store/7c1qwrqf1grxa31wvq2na0vg2val7p9w-python3-3.10.9/bin/python3.10
pytest-forked> cachedir: .pytest_cache
pytest-forked> rootdir: /build/pytest-forked-1.4.0, configfile: tox.ini
pytest-forked> plugins: forked-1.4.0
pytest-forked> collected 10 items                                                             
pytest-forked> 
pytest-forked> testing/test_boxed.py::test_functional_boxed PASSED                      [ 10%]
pytest-forked> testing/test_boxed.py::test_functional_boxed_per_test PASSED             [ 20%]
pytest-forked> testing/test_boxed.py::test_functional_boxed_capturing[no] PASSED        [ 30%]
pytest-forked> testing/test_boxed.py::test_functional_boxed_capturing[sys] XFAIL (capture cleanup needed) [ 40%]
pytest-forked> testing/test_boxed.py::test_functional_boxed_capturing[fd] XFAIL (capture cleanup needed) [ 50%]
pytest-forked> testing/test_boxed.py::test_is_not_boxed_by_default PASSED               [ 60%]
pytest-forked> testing/test_xfail_behavior.py::test_xfail[strict xfail] FAILED          [ 70%]
pytest-forked> testing/test_xfail_behavior.py::test_xfail[strict xpass] PASSED          [ 80%]
pytest-forked> testing/test_xfail_behavior.py::test_xfail[non-strict xfail] FAILED      [ 90%]
pytest-forked> testing/test_xfail_behavior.py::test_xfail[non-strict xpass] PASSED      [100%]
pytest-forked> 
pytest-forked> =================================== FAILURES ===================================
pytest-forked> ___________________________ test_xfail[strict xfail] ___________________________
pytest-forked> 
pytest-forked> is_crashing = True, is_strict = True
pytest-forked> testdir = <Testdir local('/build/pytest-of-nixbld/pytest-0/test_xfail0')>
pytest-forked> 
pytest-forked>     @pytest.mark.parametrize(
pytest-forked>         ("is_crashing", "is_strict"),
pytest-forked>         (
pytest-forked>             pytest.param(True, True, id="strict xfail"),
pytest-forked>             pytest.param(False, True, id="strict xpass"),
pytest-forked>             pytest.param(True, False, id="non-strict xfail"),
pytest-forked>             pytest.param(False, False, id="non-strict xpass"),
pytest-forked>         ),
pytest-forked>     )
pytest-forked>     def test_xfail(is_crashing, is_strict, testdir):
pytest-forked>         """Test xfail/xpass/strict permutations."""
pytest-forked>         # pylint: disable=possibly-unused-variable
pytest-forked>         sig_num = signal.SIGTERM.numerator
pytest-forked>     
pytest-forked>         test_func_body = (
pytest-forked>             "os.kill(os.getpid(), signal.SIGTERM)" if is_crashing else "assert True"
pytest-forked>         )
pytest-forked>     
pytest-forked>         if is_crashing:
pytest-forked>             # marked xfailed and crashing, no matter strict or not
pytest-forked>             expected_letter = "x"  # XFAILED
pytest-forked>             expected_lowercase = "xfailed"
pytest-forked>             expected_word = "XFAIL"
pytest-forked>         elif is_strict:
pytest-forked>             # strict and not failing as expected should cause failure
pytest-forked>             expected_letter = "F"  # FAILED
pytest-forked>             expected_lowercase = "failed"
pytest-forked>             expected_word = FAILED_WORD
pytest-forked>         elif not is_strict:
pytest-forked>             # non-strict and not failing as expected should cause xpass
pytest-forked>             expected_letter = "X"  # XPASS
pytest-forked>             expected_lowercase = "xpassed"
pytest-forked>             expected_word = "XPASS"
pytest-forked>     
pytest-forked>         session_start_title = "*==== test session starts ====*"
pytest-forked>         loaded_pytest_plugins = "plugins: forked*"
pytest-forked>         collected_tests_num = "collected 1 item"
pytest-forked>         expected_progress = "test_xfail.py {expected_letter!s}*".format(**locals())
pytest-forked>         failures_title = "*==== FAILURES ====*"
pytest-forked>         failures_test_name = "*____ test_function ____*"
pytest-forked>         failures_test_reason = "[XPASS(strict)] The process gets terminated"
pytest-forked>         short_test_summary_title = "*==== short test summary info ====*"
pytest-forked>         short_test_summary = "{expected_word!s} test_xfail.py::test_function".format(
pytest-forked>             **locals()
pytest-forked>         )
pytest-forked>         if expected_lowercase == "xpassed":
pytest-forked>             # XPASS wouldn't have the crash message from
pytest-forked>             # pytest-forked because the crash doesn't happen
pytest-forked>             short_test_summary = " ".join(
pytest-forked>                 (
pytest-forked>                     short_test_summary,
pytest-forked>                     "The process gets terminated",
pytest-forked>                 )
pytest-forked>             )
pytest-forked>         reason_string = (
pytest-forked>             "  reason: The process gets terminated; "
pytest-forked>             "pytest-forked reason: "
pytest-forked>             "*:*: running the test CRASHED with signal {sig_num:d}".format(**locals())
pytest-forked>         )
pytest-forked>         total_summary_line = "*==== 1 {expected_lowercase!s} in 0.*s* ====*".format(
pytest-forked>             **locals()
pytest-forked>         )
pytest-forked>     
pytest-forked>         expected_lines = (
pytest-forked>             session_start_title,
pytest-forked>             loaded_pytest_plugins,
pytest-forked>             collected_tests_num,
pytest-forked>             expected_progress,
pytest-forked>         )
pytest-forked>         if expected_word == FAILED_WORD:
pytest-forked>             # XPASS(strict)
pytest-forked>             expected_lines += (
pytest-forked>                 failures_title,
pytest-forked>                 failures_test_name,
pytest-forked>                 failures_test_reason,
pytest-forked>             )
pytest-forked>         expected_lines += (
pytest-forked>             short_test_summary_title,
pytest-forked>             short_test_summary,
pytest-forked>         )
pytest-forked>         if expected_lowercase == "xpassed" and expected_word == FAILED_WORD:
pytest-forked>             # XPASS(strict)
pytest-forked>             expected_lines += (reason_string,)
pytest-forked>         expected_lines += (total_summary_line,)
pytest-forked>     
pytest-forked>         test_module = testdir.makepyfile(
pytest-forked>             f"""
pytest-forked>             import os
pytest-forked>             import signal
pytest-forked>     
pytest-forked>             import pytest
pytest-forked>     
pytest-forked>             # The current implementation emits RuntimeWarning.
pytest-forked>             pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
pytest-forked>     
pytest-forked>             @pytest.mark.xfail(
pytest-forked>                 reason='The process gets terminated',
pytest-forked>                 strict={is_strict!s},
pytest-forked>             )
pytest-forked>             @pytest.mark.forked
pytest-forked>             def test_function():
pytest-forked>                 {test_func_body!s}
pytest-forked>             """
pytest-forked>         )
pytest-forked>     
pytest-forked>         pytest_run_result = testdir.runpytest(test_module, "-ra")
pytest-forked> >       pytest_run_result.stdout.fnmatch_lines(expected_lines)
pytest-forked> E       Failed: fnmatch: '*==== test session starts ====*'
pytest-forked> E          with: '============================= test session starts =============================='
pytest-forked> E       nomatch: 'plugins: forked*'
pytest-forked> E           and: 'platform linux -- Python 3.10.9, pytest-7.2.0, pluggy-1.0.0'
pytest-forked> E           and: 'rootdir: /build/pytest-of-nixbld/pytest-0/test_xfail0'
pytest-forked> E       fnmatch: 'plugins: forked*'
pytest-forked> E          with: 'plugins: forked-1.4.0'
pytest-forked> E       exact match: 'collected 1 item'
pytest-forked> E       nomatch: 'test_xfail.py x*'
pytest-forked> E           and: ''
pytest-forked> E       fnmatch: 'test_xfail.py x*'
pytest-forked> E          with: 'test_xfail.py x                                                          [100%]'
pytest-forked> E       nomatch: '*==== short test summary info ====*'
pytest-forked> E           and: ''
pytest-forked> E       fnmatch: '*==== short test summary info ====*'
pytest-forked> E          with: '=========================== short test summary info ============================'
pytest-forked> E       nomatch: 'XFAIL test_xfail.py::test_function'
pytest-forked> E           and: 'XFAIL test_xfail.py::test_function - reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15'
pytest-forked> E           and: '============================== 1 xfailed in 0.01s =============================='
pytest-forked> E       remains unmatched: 'XFAIL test_xfail.py::test_function'
pytest-forked> 
pytest-forked> /build/pytest-forked-1.4.0/testing/test_xfail_behavior.py:122: Failed
pytest-forked> ----------------------------- Captured stdout call -----------------------------
pytest-forked> ============================= test session starts ==============================
pytest-forked> platform linux -- Python 3.10.9, pytest-7.2.0, pluggy-1.0.0
pytest-forked> rootdir: /build/pytest-of-nixbld/pytest-0/test_xfail0
pytest-forked> plugins: forked-1.4.0
pytest-forked> collected 1 item
pytest-forked> 
pytest-forked> test_xfail.py x                                                          [100%]
pytest-forked> 
pytest-forked> =========================== short test summary info ============================
pytest-forked> XFAIL test_xfail.py::test_function - reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15
pytest-forked> ============================== 1 xfailed in 0.01s ==============================
pytest-forked> _________________________ test_xfail[non-strict xfail] _________________________
pytest-forked> 
pytest-forked> is_crashing = True, is_strict = False
pytest-forked> testdir = <Testdir local('/build/pytest-of-nixbld/pytest-0/test_xfail2')>
pytest-forked> 
pytest-forked>     @pytest.mark.parametrize(
pytest-forked>         ("is_crashing", "is_strict"),
pytest-forked>         (
pytest-forked>             pytest.param(True, True, id="strict xfail"),
pytest-forked>             pytest.param(False, True, id="strict xpass"),
pytest-forked>             pytest.param(True, False, id="non-strict xfail"),
pytest-forked>             pytest.param(False, False, id="non-strict xpass"),
pytest-forked>         ),
pytest-forked>     )
pytest-forked>     def test_xfail(is_crashing, is_strict, testdir):
pytest-forked>         """Test xfail/xpass/strict permutations."""
pytest-forked>         # pylint: disable=possibly-unused-variable
pytest-forked>         sig_num = signal.SIGTERM.numerator
pytest-forked>     
pytest-forked>         test_func_body = (
pytest-forked>             "os.kill(os.getpid(), signal.SIGTERM)" if is_crashing else "assert True"
pytest-forked>         )
pytest-forked>     
pytest-forked>         if is_crashing:
pytest-forked>             # marked xfailed and crashing, no matter strict or not
pytest-forked>             expected_letter = "x"  # XFAILED
pytest-forked>             expected_lowercase = "xfailed"
pytest-forked>             expected_word = "XFAIL"
pytest-forked>         elif is_strict:
pytest-forked>             # strict and not failing as expected should cause failure
pytest-forked>             expected_letter = "F"  # FAILED
pytest-forked>             expected_lowercase = "failed"
pytest-forked>             expected_word = FAILED_WORD
pytest-forked>         elif not is_strict:
pytest-forked>             # non-strict and not failing as expected should cause xpass
pytest-forked>             expected_letter = "X"  # XPASS
pytest-forked>             expected_lowercase = "xpassed"
pytest-forked>             expected_word = "XPASS"
pytest-forked>     
pytest-forked>         session_start_title = "*==== test session starts ====*"
pytest-forked>         loaded_pytest_plugins = "plugins: forked*"
pytest-forked>         collected_tests_num = "collected 1 item"
pytest-forked>         expected_progress = "test_xfail.py {expected_letter!s}*".format(**locals())
pytest-forked>         failures_title = "*==== FAILURES ====*"
pytest-forked>         failures_test_name = "*____ test_function ____*"
pytest-forked>         failures_test_reason = "[XPASS(strict)] The process gets terminated"
pytest-forked>         short_test_summary_title = "*==== short test summary info ====*"
pytest-forked>         short_test_summary = "{expected_word!s} test_xfail.py::test_function".format(
pytest-forked>             **locals()
pytest-forked>         )
pytest-forked>         if expected_lowercase == "xpassed":
pytest-forked>             # XPASS wouldn't have the crash message from
pytest-forked>             # pytest-forked because the crash doesn't happen
pytest-forked>             short_test_summary = " ".join(
pytest-forked>                 (
pytest-forked>                     short_test_summary,
pytest-forked>                     "The process gets terminated",
pytest-forked>                 )
pytest-forked>             )
pytest-forked>         reason_string = (
pytest-forked>             "  reason: The process gets terminated; "
pytest-forked>             "pytest-forked reason: "
pytest-forked>             "*:*: running the test CRASHED with signal {sig_num:d}".format(**locals())
pytest-forked>         )
pytest-forked>         total_summary_line = "*==== 1 {expected_lowercase!s} in 0.*s* ====*".format(
pytest-forked>             **locals()
pytest-forked>         )
pytest-forked>     
pytest-forked>         expected_lines = (
pytest-forked>             session_start_title,
pytest-forked>             loaded_pytest_plugins,
pytest-forked>             collected_tests_num,
pytest-forked>             expected_progress,
pytest-forked>         )
pytest-forked>         if expected_word == FAILED_WORD:
pytest-forked>             # XPASS(strict)
pytest-forked>             expected_lines += (
pytest-forked>                 failures_title,
pytest-forked>                 failures_test_name,
pytest-forked>                 failures_test_reason,
pytest-forked>             )
pytest-forked>         expected_lines += (
pytest-forked>             short_test_summary_title,
pytest-forked>             short_test_summary,
pytest-forked>         )
pytest-forked>         if expected_lowercase == "xpassed" and expected_word == FAILED_WORD:
pytest-forked>             # XPASS(strict)
pytest-forked>             expected_lines += (reason_string,)
pytest-forked>         expected_lines += (total_summary_line,)
pytest-forked>     
pytest-forked>         test_module = testdir.makepyfile(
pytest-forked>             f"""
pytest-forked>             import os
pytest-forked>             import signal
pytest-forked>     
pytest-forked>             import pytest
pytest-forked>     
pytest-forked>             # The current implementation emits RuntimeWarning.
pytest-forked>             pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
pytest-forked>     
pytest-forked>             @pytest.mark.xfail(
pytest-forked>                 reason='The process gets terminated',
pytest-forked>                 strict={is_strict!s},
pytest-forked>             )
pytest-forked>             @pytest.mark.forked
pytest-forked>             def test_function():
pytest-forked>                 {test_func_body!s}
pytest-forked>             """
pytest-forked>         )
pytest-forked>     
pytest-forked>         pytest_run_result = testdir.runpytest(test_module, "-ra")
pytest-forked> >       pytest_run_result.stdout.fnmatch_lines(expected_lines)
pytest-forked> E       Failed: fnmatch: '*==== test session starts ====*'
pytest-forked> E          with: '============================= test session starts =============================='
pytest-forked> E       nomatch: 'plugins: forked*'
pytest-forked> E           and: 'platform linux -- Python 3.10.9, pytest-7.2.0, pluggy-1.0.0'
pytest-forked> E           and: 'rootdir: /build/pytest-of-nixbld/pytest-0/test_xfail2'
pytest-forked> E       fnmatch: 'plugins: forked*'
pytest-forked> E          with: 'plugins: forked-1.4.0'
pytest-forked> E       exact match: 'collected 1 item'
pytest-forked> E       nomatch: 'test_xfail.py x*'
pytest-forked> E           and: ''
pytest-forked> E       fnmatch: 'test_xfail.py x*'
pytest-forked> E          with: 'test_xfail.py x                                                          [100%]'
pytest-forked> E       nomatch: '*==== short test summary info ====*'
pytest-forked> E           and: ''
pytest-forked> E       fnmatch: '*==== short test summary info ====*'
pytest-forked> E          with: '=========================== short test summary info ============================'
pytest-forked> E       nomatch: 'XFAIL test_xfail.py::test_function'
pytest-forked> E           and: 'XFAIL test_xfail.py::test_function - reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15'
pytest-forked> E           and: '============================== 1 xfailed in 0.01s =============================='
pytest-forked> E       remains unmatched: 'XFAIL test_xfail.py::test_function'
pytest-forked> 
pytest-forked> /build/pytest-forked-1.4.0/testing/test_xfail_behavior.py:122: Failed
pytest-forked> ----------------------------- Captured stdout call -----------------------------
pytest-forked> ============================= test session starts ==============================
pytest-forked> platform linux -- Python 3.10.9, pytest-7.2.0, pluggy-1.0.0
pytest-forked> rootdir: /build/pytest-of-nixbld/pytest-0/test_xfail2
pytest-forked> plugins: forked-1.4.0
pytest-forked> collected 1 item
pytest-forked> 
pytest-forked> test_xfail.py x                                                          [100%]
pytest-forked> 
pytest-forked> =========================== short test summary info ============================
pytest-forked> XFAIL test_xfail.py::test_function - reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15
pytest-forked> ============================== 1 xfailed in 0.01s ==============================
pytest-forked> =========================== short test summary info ============================
pytest-forked> XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys] - capture cleanup needed
pytest-forked> XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd] - capture cleanup needed
pytest-forked> ==================== 2 failed, 6 passed, 2 xfailed in 0.41s ====================
pytest-forked> /nix/store/a45y55p9zjyxvzqyvk8l1diy0c3jrb03-stdenv-linux/setup: line 1570: pop_var_context: head of shell_variables not a function context
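For context on the failure: pytest 7.2.0 appends the xfail reason to the short-summary line (`XFAIL test_xfail.py::test_function - reason: ...`), while the test still expects the bare `XFAIL test_xfail.py::test_function` pattern, which `fnmatch_lines` treats as an exact match. A minimal sketch of the mismatch, using the lines from the captured output above (the trailing-wildcard fix shown is an assumption about one possible remedy, not necessarily what the main branch does):

```python
from fnmatch import fnmatch

# Pattern the test expects (from testing/test_xfail_behavior.py)
expected = "XFAIL test_xfail.py::test_function"

# Line pytest 7.2.0 actually prints (copied from the captured stdout above)
actual = (
    "XFAIL test_xfail.py::test_function - reason: The process gets terminated; "
    "pytest-forked reason: :-1: running the test CRASHED with signal 15"
)

# Without a wildcard, fnmatch requires the whole line to match,
# so the new "- reason: ..." suffix breaks the expectation.
assert not fnmatch(actual, expected)

# A trailing wildcard would tolerate the appended reason text.
assert fnmatch(actual, expected + "*")
```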
webknjaz commented 1 year ago

I think it's already fixed on the main branch.

mweinelt commented 1 year ago

Confirmed fixed on main. Thanks!