[Open] k3v8ns opened this issue 2 years ago
Interesting. What OS is this on?
Can you reproduce this without pytest-timer?
(allBase39) ➜ temp pip uninstall pytest-timer
Found existing installation: pytest-timer 0.0.11
Uninstalling pytest-timer-0.0.11:
Would remove:
/opt/miniconda3/envs/allBase39/lib/python3.9/site-packages/pytest_timer-0.0.11.dist-info/*
/opt/miniconda3/envs/allBase39/lib/python3.9/site-packages/pytest_timer/*
Proceed (Y/n)?
Successfully uninstalled pytest-timer-0.0.11
(allBase39) ➜ temp pytest
Test session starts (platform: darwin, Python 3.9.7, pytest 6.2.5, pytest-sugar 0.9.4)
rootdir: /Users/xtemp/temp, configfile: pytest.ini
plugins: sugar-0.9.4, allure-pytest-2.9.45, html-3.1.1, metadata-1.11.0, xdist-2.4.0, timeout-1.4.2, json-report-1.4.0
collecting ...
testaaa.py::Testaseryhdsdaaaaa.test_adf ✓ 10% █
testaaa.py::Testertqwetaaaaa.test_adfs ✓ 20% ██
testaaa.py::Testasaaaa.test_adfe ✓ 30% ███
testaaa.py::Testbbbbbb.test_aaaaasssa ✓ 40% ████
testaaa.py::Testbbbbbb.test_aaadaaasssammmm ✓ 50% █████
testaaa.py::Testbbbbbb.test_aaadaaasssa3 ✓ 60% ██████
testaaa.py::Testbbbbbb.test_aaadaaasssa0 ✓ 70% ███████
testaaa.py::Testbbbbbb.test_aaaajjladaaassaaasa0 ✓ 80% ████████
testaaa.py::Testbbbbbb.test_aaadaaasadfsaaasa0 ✓ 90% █████████
testaaa.py::Testbbbbbb.test_aaadaaassaadaasa0 ✓ 100% ██████████
------------------------------------------ generated html file: file:///Users/xtemp/temp/temp/index.html ------------------------------------------
Results (0.05s):
20 passed
Interesting. What OS is this on?
macOS Big Sur 11.6
Thanks, I'll see if I can repro locally.
I experienced the same issue -- this behavior only occurs when run in combination with pytest-sugar, since it re-defines a lot of hooks.
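The hook interaction can be illustrated without pytest at all. The sketch below is a toy dispatcher, not pytest's actual pluggy machinery, and the plugin functions are hypothetical stand-ins: when two plugins both implement the same reporting hook and both record the report into a shared summary, one dispatched report shows up twice.

```python
# Illustrative toy dispatcher, not pytest's actual pluggy internals.
class HookCaller:
    """Calls every registered implementation of one hook, the way a
    hook like pytest_runtest_logreport reaches all installed plugins."""
    def __init__(self):
        self._impls = []

    def register(self, impl):
        self._impls.append(impl)

    def __call__(self, report):
        for impl in self._impls:
            impl(report)

results = []  # shared bucket standing in for the terminal summary

# Hypothetical stand-ins for two plugins that both re-define the
# same reporting hook and both record the report:
def plugin_a_logreport(report):
    results.append(report)

def plugin_b_logreport(report):
    results.append(report)

hook = HookCaller()
hook.register(plugin_a_logreport)
hook.register(plugin_b_logreport)

hook("test_a_ok PASSED")  # one report, dispatched once...
print(len(results))       # ...recorded twice -> 2
```

This is why the symptom looks like "double counted tests": each test only ran once, but two reporters each logged it.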
Any idea what the solution might be?
(allBase39) ➜ temp pip uninstall pytest-sugar
Found existing installation: pytest-sugar 0.9.4
Uninstalling pytest-sugar-0.9.4:
Would remove:
/opt/miniconda3/envs/allBase39/lib/python3.9/site-packages/pytest_sugar-0.9.4.dist-info/*
/opt/miniconda3/envs/allBase39/lib/python3.9/site-packages/pytest_sugar.py
Proceed (Y/n)? y
Successfully uninstalled pytest-sugar-0.9.4
(allBase39) ➜ temp pytest
================================================================ test session starts =================================================================
platform darwin -- Python 3.9.7, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /Users/xtemp/temp, configfile: pytest.ini
plugins: allure-pytest-2.9.45, html-3.1.1, metadata-1.11.0, xdist-2.4.0, order-1.0.1, timer-0.0.11, timeout-1.4.2, json-report-1.4.0
collected 10 items
testaaa.py::Testaseryhdsdaaaaa::test_adf PASSED [ 10%]
testaaa.py::Testertqwetaaaaa::test_adfs PASSED [ 20%]
testaaa.py::Testasaaaa::test_adfe PASSED [ 30%]
testaaa.py::Testbbbbbb::test_aaaaasssa PASSED [ 40%]
testaaa.py::Testbbbbbb::test_aaadaaasssammmm PASSED [ 50%]
testaaa.py::Testbbbbbb::test_aaadaaasssa3 PASSED [ 60%]
testaaa.py::Testbbbbbb::test_aaadaaasssa0 PASSED [ 70%]
testaaa.py::Testbbbbbb::test_aaaajjladaaassaaasa0 PASSED [ 80%]
testaaa.py::Testbbbbbb::test_aaadaaasadfsaaasa0 PASSED [ 90%]
testaaa.py::Testbbbbbb::test_aaadaaassaadaasa0 PASSED [100%]
------------------------------------------ generated html file: file:///Users/xtemp/temp/temp/index.html ------------------------------------------
==================================================================== pytest-timer ====================================================================
[success] 8.86% testaaa.py::Testaseryhdsdaaaaa::test_adf: 0.0007s
[success] 8.60% testaaa.py::Testertqwetaaaaa::test_adfs: 0.0007s
[success] 8.40% testaaa.py::Testbbbbbb::test_aaadaaasssa0: 0.0006s
[success] 7.67% testaaa.py::Testbbbbbb::test_aaaajjladaaassaaasa0: 0.0006s
[success] 7.13% testaaa.py::Testbbbbbb::test_aaadaaasadfsaaasa0: 0.0005s
[success] 7.06% testaaa.py::Testbbbbbb::test_aaadaaassaadaasa0: 0.0005s
[success] 7.04% testaaa.py::Testbbbbbb::test_aaadaaasssa3: 0.0005s
[success] 6.73% testaaa.py::Testbbbbbb::test_aaadaaasssammmm: 0.0005s
[success] 6.72% testaaa.py::Testasaaaa::test_adfe: 0.0005s
[success] 6.69% testaaa.py::Testbbbbbb::test_aaaaasssa: 0.0005s
[success] 3.33% testaaa.py::Testbbbbbb::test_aaadaaasssa0: 0.0003s
[success] 2.53% testaaa.py::Testertqwetaaaaa::test_adfs: 0.0002s
[success] 2.52% testaaa.py::Testbbbbbb::test_aaaajjladaaassaaasa0: 0.0002s
[success] 2.51% testaaa.py::Testaseryhdsdaaaaa::test_adf: 0.0002s
[success] 2.42% testaaa.py::Testbbbbbb::test_aaadaaasadfsaaasa0: 0.0002s
[success] 2.41% testaaa.py::Testbbbbbb::test_aaadaaassaadaasa0: 0.0002s
[success] 2.37% testaaa.py::Testbbbbbb::test_aaadaaasssa3: 0.0002s
[success] 2.37% testaaa.py::Testbbbbbb::test_aaaaasssa: 0.0002s
[success] 2.32% testaaa.py::Testasaaaa::test_adfe: 0.0002s
[success] 2.30% testaaa.py::Testbbbbbb::test_aaadaaasssammmm: 0.0002s
================================================================= 10 passed in 0.04s =================================================================
After uninstalling pytest-sugar, there are still 20 passed results under the pytest-timer section for the 10 collected tests, so the pytest-html plugin still has to be disabled as well.
(allBase39) ➜ temp cat pytest.ini
[pytest]
xfail_strict = true
log_level = info
log_format = %(message)s
log_cli = true
log_cli_level = info
log_cli_format = %(message)s
;addopts =--html=temp/index.html --self-contained-html
python_files = test*.py
(allBase39) ➜ temp pytest
================================================================ test session starts =================================================================
platform darwin -- Python 3.9.7, pytest-6.2.5, py-1.10.0, pluggy-1.0.0
rootdir: /Users/xtemp/temp, configfile: pytest.ini
plugins: allure-pytest-2.9.45, html-3.1.1, metadata-1.11.0, xdist-2.4.0, order-1.0.1, timer-0.0.11, timeout-1.4.2, json-report-1.4.0
collected 10 items
testaaa.py::Testaseryhdsdaaaaa::test_adf PASSED [ 10%]
testaaa.py::Testertqwetaaaaa::test_adfs PASSED [ 20%]
testaaa.py::Testasaaaa::test_adfe PASSED [ 30%]
testaaa.py::Testbbbbbb::test_aaaaasssa PASSED [ 40%]
testaaa.py::Testbbbbbb::test_aaadaaasssammmm PASSED [ 50%]
testaaa.py::Testbbbbbb::test_aaadaaasssa3 PASSED [ 60%]
testaaa.py::Testbbbbbb::test_aaadaaasssa0 PASSED [ 70%]
testaaa.py::Testbbbbbb::test_aaaajjladaaassaaasa0 PASSED [ 80%]
testaaa.py::Testbbbbbb::test_aaadaaasadfsaaasa0 PASSED [ 90%]
testaaa.py::Testbbbbbb::test_aaadaaassaadaasa0 PASSED [100%]
==================================================================== pytest-timer ====================================================================
[success] 12.77% testaaa.py::Testbbbbbb::test_aaadaaasadfsaaasa0: 0.0002s
[success] 10.55% testaaa.py::Testaseryhdsdaaaaa::test_adf: 0.0002s
[success] 9.87% testaaa.py::Testbbbbbb::test_aaaajjladaaassaaasa0: 0.0001s
[success] 9.66% testaaa.py::Testasaaaa::test_adfe: 0.0001s
[success] 9.60% testaaa.py::Testbbbbbb::test_aaadaaassaadaasa0: 0.0001s
[success] 9.59% testaaa.py::Testbbbbbb::test_aaadaaasssa3: 0.0001s
[success] 9.54% testaaa.py::Testbbbbbb::test_aaaaasssa: 0.0001s
[success] 9.53% testaaa.py::Testbbbbbb::test_aaadaaasssammmm: 0.0001s
[success] 9.46% testaaa.py::Testbbbbbb::test_aaadaaasssa0: 0.0001s
[success] 9.44% testaaa.py::Testertqwetaaaaa::test_adfs: 0.0001s
================================================================= 10 passed in 0.02s =================================================================
With the --html addopts commented out in pytest.ini, I now get the expected 10 passed results under the pytest-timer section.
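As an aside, bisecting this kind of plugin interaction does not require uninstalling packages: pytest can deactivate a plugin per run with `-p no:NAME`. The entry-point names below are assumptions read off the `plugins:` line in the session header; `pytest --trace-config` lists the actual registered names.

```ini
; pytest.ini -- hypothetical bisection setup: deactivate one plugin
; per run instead of uninstalling it (equivalent to passing
; -p no:sugar / -p no:html on the command line).
[pytest]
addopts = -p no:sugar
```

Toggling each suspect plugin off in turn should isolate which pair produces the doubled counts.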
I am experiencing the double counted test problem as well. For me it seems to be an interaction between pytest-html and pytest-sugar.
I did the following:
- pytest without HTML report generation but with sugar enabled: correct count: 11
- pytest --html=test-results/report.html, also with sugar: incorrect count: 22
- uninstall pytest-sugar, then pytest --html=test-results/report.html: correct count: 11
❯ pytest
Test session starts (platform: linux, Python 3.9.7, pytest 7.1.1, pytest-sugar 0.9.4)
rootdir: /home/kevbroch/rivos/it-repo-grp-wa/rv/it/int/manifest
plugins: metadata-2.0.1, sugar-0.9.4, hypothesis-6.39.4, dash-2.3.0, pymtl3-3.1.10, anyio-3.5.0, html-3.1.1
collecting ...
tests/test_manifests.py ✓✓✓✓✓✓✓✓✓✓✓ 100% ██████████
Results (0.05s):
11 passed
it/int/manifest on HEAD (9a9bb0c) via 🐍 v3.9.7 (py3venv) took 4s
❯ pytest --html=test-results/report.html
Test session starts (platform: linux, Python 3.9.7, pytest 7.1.1, pytest-sugar 0.9.4)
rootdir: /home/kevbroch/rivos/it-repo-grp-wa/rv/it/int/manifest
plugins: metadata-2.0.1, sugar-0.9.4, hypothesis-6.39.4, dash-2.3.0, pymtl3-3.1.10, anyio-3.5.0, html-3.1.1
collecting ...
tests/test_manifests.py ✓✓✓✓✓✓✓✓✓✓✓ 100% ██████████
------------------------ generated html file: file:///home/kevbroch/rivos/it-repo-grp-wa/rv/it/int/manifest/test-results/report.html -------------------------
Results (0.06s):
22 passed
it/int/manifest on HEAD (9a9bb0c) via 🐍 v3.9.7 (py3venv)
❯ pip3 uninstall pytest-sugar
Found existing installation: pytest-sugar 0.9.4
Uninstalling pytest-sugar-0.9.4:
Would remove:
/home/kevbroch/py3venv/lib/python3.9/site-packages/pytest_sugar-0.9.4.dist-info/*
/home/kevbroch/py3venv/lib/python3.9/site-packages/pytest_sugar.py
Proceed (Y/n)? Y
Successfully uninstalled pytest-sugar-0.9.4
it/int/manifest on HEAD (9a9bb0c) via 🐍 v3.9.7 (py3venv) took 3s
❯ pytest --html=test-results/report.html
==================================================================== test session starts =====================================================================
platform linux -- Python 3.9.7, pytest-7.1.1, pluggy-1.0.0
rootdir: /home/kevbroch/rivos/it-repo-grp-wa/rv/it/int/manifest
plugins: metadata-2.0.1, hypothesis-6.39.4, dash-2.3.0, pymtl3-3.1.10, anyio-3.5.0, html-3.1.1
collected 11 items
tests/test_manifests.py ........... [100%]
------------------------ generated html file: file:///home/kevbroch/rivos/it-repo-grp-wa/rv/it/int/manifest/test-results/report.html -------------------------
===================================================================== 11 passed in 0.05s =====================================================================
One thing I noticed is your pytest.ini has log_cli = true. This option populates the output with seemingly double the test traffic. In reality, the same test info gets sent to the terminal while the test is running, and then the aggregated results get sent to the terminal again after the whole test run is complete.
What happens when you set log_cli = false?
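The live-then-summary doubling described above can be modeled with plain stdlib logging. This is only an analogy for the behavior, not pytest's implementation: one handler writes each record immediately (the "live" log), a second buffers it for replay at the end, so a single logged line appears on the terminal twice.

```python
import io
import logging

# Toy model of the apparent log_cli doubling: one handler prints
# records live, a second buffers them for an end-of-session summary.
live = io.StringIO()   # stands in for the live terminal stream
summary = []           # records replayed after the session

logger = logging.getLogger("demo")
logger.setLevel(logging.INFO)
logger.propagate = False
logger.addHandler(logging.StreamHandler(live))

class SummaryHandler(logging.Handler):
    def emit(self, record):
        summary.append(record.getMessage())

logger.addHandler(SummaryHandler())

# One test report is logged exactly once...
logger.info("testaaa.py::Testbbbbbb::test_aaadaaasssa0 PASSED")

print(live.getvalue().strip())  # ...shown while the test runs
print(summary[0])               # ...and shown again in the summary
```

The test itself executed once; only its terminal line is duplicated, which is why the counts in the final `N passed` tally are unaffected by log_cli.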
With the log_cli options disabled:
;log_cli = true
;log_cli_level = debug
;log_cli_format = %(message)s
I get this output:
============================= test session starts ==============================
platform darwin -- Python 3.9.7, pytest-6.2.5, py-1.10.0, pluggy-1.0.0 -- /opt/miniconda3/envs/allBase39/bin/python
cachedir: .pytest_cache
rootdir: /Users/xtemp/temp, configfile: pytest.ini
plugins: allure-pytest-2.9.45, html-3.1.1, metadata-1.11.0, xdist-2.4.0, order-1.0.1, timer-0.0.11, timeout-1.4.2, json-report-1.4.0
timeout: 14400.0s
timeout method: signal
timeout func_only: False
collecting ... collected 2 items
testaaa.py::Testbbbbbb::test_aaadaaasadfsaaasa0 PASSED [ 90%]
testaaa.py::Testbbbbbb::test_aaadaaassaadaasa0 PASSED [ 100%]
- generated html file: file:///Users/xtemp/temp/index.html -
--------------------------------- JSON report ----------------------------------
report saved to: temp/index.json
================================= pytest-timer =================================
[success] 26.51% testaaa.py::Testbbbbbb::test_aaadaaasadfsaaasa0: 0.0005s
[success] 26.51% testaaa.py::Testbbbbbb::test_aaadaaasadfsaaasa0: 0.0002s
[success] 23.49% testaaa.py::Testbbbbbb::test_aaadaaassaadaasa0: 0.0005s
[success] 23.49% testaaa.py::Testbbbbbb::test_aaadaaassaadaasa0: 0.0002s
============================== 2 passed in 53.70s ==============================
TEST_CASE_AMOUNT: {"skipped": 0, "passed": 0, "failed": 0, "errors": 0}
That's not the same behavior I see.
Same issue for me with pytest-sugar.
Here is the summary info output I get running WITHOUT pytest-sugar:
================================================================================== short test summary info ==================================================================================
PASSED tests/test_1.py::test_a_ok
PASSED tests/test_1.py::test_g_eval_parameterized[3+5-8]
PASSED tests/test_1.py::test_g_eval_parameterized[2+4-6]
PASSED tests/test_1.py::test_1_passes_and_has_logging_output
PASSED tests/test_1.py::test_4_passes
PASSED tests/test_1.py::test_13_passes_and_has_stdout
PASSED tests/test_2.py::test_a_ok
PASSED tests/test_warnings.py::test_2_passes_with_warnings
SKIPPED [1] tests/test_1.py:35: Skipping this test with inline call to 'pytest.skip()'.
SKIPPED [1] tests/test_1.py:40: Skipping this test with inline call to 'pytest.skip()'.
SKIPPED [1] tests/test_1.py:105: unconditional skip
XFAIL tests/test_1.py::test_e1_xfail_by_inline_and_has_reason
reason: Marked as Xfail with inline call to 'pytest.xfail()'.
XFAIL tests/test_1.py::test_e2_xfail_by_decorator_and_has_reason
reason: Marked as Xfail with decorator.
XFAIL tests/test_1.py::test_f1_xfails_by_inline_even_though_assertTrue_happens_before_pytestDotXfail
reason: Marked as Xfail with inline call to 'pytest.xfail()'.
XFAIL tests/test_1.py::test_7_marked_xfail_by_decorator_and_fails_and_has_no_reason
XFAIL tests/test_xpass_xfail.py::test_xfail_by_inline
reason: xfailing this test with 'pytest.xfail()'
XFAIL tests/test_xpass_xfail.py::test_xfail_by_decorator
Here's my reason for xfail: None
XPASS tests/test_1.py::test_f2_xpass_by_xfail_decorator_and_has_reason Marked as Xfail with decorator.
XPASS tests/test_1.py::test_6_marked_xfail_by_decorator_but_passes_and_has_no_reason
ERROR tests/test_1.py::test_c_error - assert 0
ERROR tests/test_1.py::test_14_causes_error_pass_stderr_stdout_stdlog
ERROR tests/test_1.py::test_15_causes_error_fail_stderr_stdout_stdlog
ERROR tests/test_2.py::test_c_error - assert 0
FAILED tests/test_1.py::test_b_fail - assert 0
FAILED tests/test_1.py::test_g_eval_parameterized[6*9-42] - AssertionError: assert 54 == 42
FAILED tests/test_1.py::test_2_fails_and_has_logging_output - assert 0 == 1
FAILED tests/test_1.py::test_3_fails - assert 0
FAILED tests/test_1.py::test_8_causes_a_warning - TypeError: api_v1() missing 1 required positional argument: 'log_testname'
FAILED tests/test_1.py::test_9_lorem_fails - assert False
FAILED tests/test_1.py::test_10_fail_capturing - assert False
FAILED tests/test_1.py::test_11_pass_capturing - TypeError: disabled() takes 1 positional argument but 2 were given
FAILED tests/test_1.py::test_12_fails_and_has_stdout - assert 0 == 1
FAILED tests/test_1.py::test_16_fail_compare_dicts_for_pytest_icdiff - AssertionError: assert ['Hello', 'hi... 'at', 'this'] == [7, 10, 45, 23, 77]
FAILED tests/test_2.py::test_b_fail - assert 0
FAILED tests/test_warnings.py::test_1_fails_with_warnings - assert False
=================================================== 12 failed, 8 passed, 3 skipped, 6 xfailed, 2 xpassed, 22 warnings, 4 errors in 3.94s =====================
And here is the output WITH pytest-sugar:
================================================================================== short test summary info ==================================================================================
PASSED tests/test_1.py::test_a_ok
PASSED tests/test_1.py::test_a_ok
PASSED tests/test_1.py::test_a_ok
PASSED tests/test_1.py::test_b_fail
FAILED tests/test_1.py::test_b_fail
FAILED tests/test_1.py::test_c_error
PASSED tests/test_1.py::test_d1_skip_inline
SKIPPED tests/test_1.py::test_d1_skip_inline
PASSED tests/test_1.py::test_d2_skip_decorator
SKIPPED tests/test_1.py::test_d2_skip_decorator
PASSED tests/test_1.py::test_e1_xfail_by_inline_and_has_reason
xfail tests/test_1.py::test_e1_xfail_by_inline_and_has_reason
PASSED tests/test_1.py::test_e2_xfail_by_decorator_and_has_reason
xfail tests/test_1.py::test_e2_xfail_by_decorator_and_has_reason
PASSED tests/test_1.py::test_f1_xfails_by_inline_even_though_assertTrue_happens_before_pytestDotXfail
xfail tests/test_1.py::test_f1_xfails_by_inline_even_though_assertTrue_happens_before_pytestDotXfail
PASSED tests/test_1.py::test_f2_xpass_by_xfail_decorator_and_has_reason
XPASS tests/test_1.py::test_f2_xpass_by_xfail_decorator_and_has_reason
PASSED tests/test_1.py::test_g_eval_parameterized[3+5-8]
PASSED tests/test_1.py::test_g_eval_parameterized[3+5-8]
PASSED tests/test_1.py::test_g_eval_parameterized[3+5-8]
PASSED tests/test_1.py::test_g_eval_parameterized[2+4-6]
PASSED tests/test_1.py::test_g_eval_parameterized[2+4-6]
PASSED tests/test_1.py::test_g_eval_parameterized[2+4-6]
PASSED tests/test_1.py::test_g_eval_parameterized[6*9-42]
FAILED tests/test_1.py::test_g_eval_parameterized[6*9-42]
PASSED tests/test_1.py::test_1_passes_and_has_logging_output
PASSED tests/test_1.py::test_1_passes_and_has_logging_output
PASSED tests/test_1.py::test_1_passes_and_has_logging_output
PASSED tests/test_1.py::test_2_fails_and_has_logging_output
FAILED tests/test_1.py::test_2_fails_and_has_logging_output
PASSED tests/test_1.py::test_3_fails
FAILED tests/test_1.py::test_3_fails
PASSED tests/test_1.py::test_4_passes
PASSED tests/test_1.py::test_4_passes
PASSED tests/test_1.py::test_4_passes
SKIPPED tests/test_1.py::test_5_marked_SKIP
PASSED tests/test_1.py::test_6_marked_xfail_by_decorator_but_passes_and_has_no_reason
XPASS tests/test_1.py::test_6_marked_xfail_by_decorator_but_passes_and_has_no_reason
PASSED tests/test_1.py::test_7_marked_xfail_by_decorator_and_fails_and_has_no_reason
xfail tests/test_1.py::test_7_marked_xfail_by_decorator_and_fails_and_has_no_reason
PASSED tests/test_1.py::test_8_causes_a_warning
FAILED tests/test_1.py::test_8_causes_a_warning
PASSED tests/test_1.py::test_9_lorem_fails
FAILED tests/test_1.py::test_9_lorem_fails
PASSED tests/test_1.py::test_10_fail_capturing
FAILED tests/test_1.py::test_10_fail_capturing
PASSED tests/test_1.py::test_11_pass_capturing
FAILED tests/test_1.py::test_11_pass_capturing
PASSED tests/test_1.py::test_12_fails_and_has_stdout
FAILED tests/test_1.py::test_12_fails_and_has_stdout
PASSED tests/test_1.py::test_13_passes_and_has_stdout
PASSED tests/test_1.py::test_13_passes_and_has_stdout
PASSED tests/test_1.py::test_13_passes_and_has_stdout
FAILED tests/test_1.py::test_14_causes_error_pass_stderr_stdout_stdlog
FAILED tests/test_1.py::test_15_causes_error_fail_stderr_stdout_stdlog
PASSED tests/test_1.py::test_16_fail_compare_dicts_for_pytest_icdiff
FAILED tests/test_1.py::test_16_fail_compare_dicts_for_pytest_icdiff
PASSED tests/test_2.py::test_a_ok
PASSED tests/test_2.py::test_a_ok
PASSED tests/test_2.py::test_a_ok
PASSED tests/test_2.py::test_b_fail
FAILED tests/test_2.py::test_b_fail
FAILED tests/test_2.py::test_c_error
PASSED tests/test_warnings.py::test_1_fails_with_warnings
FAILED tests/test_warnings.py::test_1_fails_with_warnings
PASSED tests/test_warnings.py::test_2_passes_with_warnings
PASSED tests/test_warnings.py::test_2_passes_with_warnings
PASSED tests/test_warnings.py::test_2_passes_with_warnings
PASSED tests/test_xpass_xfail.py::test_xfail_by_inline
xfail tests/test_xpass_xfail.py::test_xfail_by_inline
PASSED tests/test_xpass_xfail.py::test_xfail_by_decorator
xfail tests/test_xpass_xfail.py::test_xfail_by_decorator
SKIPPED [1] tests/test_1.py:35: Skipping this test with inline call to 'pytest.skip()'.
SKIPPED [1] tests/test_1.py:40: Skipping this test with inline call to 'pytest.skip()'.
SKIPPED [1] tests/test_1.py:105: unconditional skip
xfail tests/test_1.py::test_e1_xfail_by_inline_and_has_reason
reason: Marked as Xfail with inline call to 'pytest.xfail()'.
xfail tests/test_1.py::test_e2_xfail_by_decorator_and_has_reason
reason: Marked as Xfail with decorator.
xfail tests/test_1.py::test_f1_xfails_by_inline_even_though_assertTrue_happens_before_pytestDotXfail
reason: Marked as Xfail with inline call to 'pytest.xfail()'.
xfail tests/test_1.py::test_7_marked_xfail_by_decorator_and_fails_and_has_no_reason
xfail tests/test_xpass_xfail.py::test_xfail_by_inline
reason: xfailing this test with 'pytest.xfail()'
xfail tests/test_xpass_xfail.py::test_xfail_by_decorator
Here's my reason for xfail: None
XPASS tests/test_1.py::test_f2_xpass_by_xfail_decorator_and_has_reason Marked as Xfail with decorator.
XPASS tests/test_1.py::test_6_marked_xfail_by_decorator_but_passes_and_has_no_reason
FAILED tests/test_1.py::test_b_fail - assert 0
FAILED tests/test_1.py::test_c_error - assert 0
FAILED tests/test_1.py::test_g_eval_parameterized[6*9-42] - AssertionError: assert 54 == 42
FAILED tests/test_1.py::test_2_fails_and_has_logging_output - assert 0 == 1
FAILED tests/test_1.py::test_3_fails - assert 0
FAILED tests/test_1.py::test_8_causes_a_warning - TypeError: api_v1() missing 1 required positional argument: 'log_testname'
FAILED tests/test_1.py::test_9_lorem_fails - assert False
FAILED tests/test_1.py::test_10_fail_capturing - assert False
FAILED tests/test_1.py::test_11_pass_capturing - TypeError: disabled() takes 1 positional argument but 2 were given
FAILED tests/test_1.py::test_12_fails_and_has_stdout - assert 0 == 1
FAILED tests/test_1.py::test_14_causes_error_pass_stderr_stdout_stdlog
FAILED tests/test_1.py::test_15_causes_error_fail_stderr_stdout_stdlog
FAILED tests/test_1.py::test_16_fail_compare_dicts_for_pytest_icdiff - AssertionError: assert ['Hello', 'hi... 'at', 'this'] == [7, 10, 45, 23, 77]
FAILED tests/test_2.py::test_b_fail - assert 0
FAILED tests/test_2.py::test_c_error - assert 0
FAILED tests/test_warnings.py::test_1_fails_with_warnings - assert False
Results (3.24s):
39 passed
2 xpassed
12 failed
- tests/test_1.py:26 test_b_fail
- tests/test_1.py:61 test_g_eval_parameterized[6*9-42]
- tests/test_1.py:88 test_2_fails_and_has_logging_output
- tests/test_1.py:97 test_3_fails
- tests/test_1.py:126 test_8_causes_a_warning
- tests/test_1.py:132 test_9_lorem_fails
- tests/test_1.py:142 test_10_fail_capturing
- tests/test_1.py:156 test_11_pass_capturing
- tests/test_1.py:175 test_12_fails_and_has_stdout
- tests/test_1.py:206 test_16_fail_compare_dicts_for_pytest_icdiff
- tests/test_2.py:26 test_b_fail
- tests/test_warnings.py:10 test_1_fails_with_warnings
4 error
6 xfailed
3 skipped
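The doubled PASSED/FAILED pairs in the summary above can be detected mechanically: each test nodeid should produce exactly one terminal report per run, so counting nodeids flags duplicated dispatches. A minimal sketch (not pytest-html's or pytest-sugar's actual code; the nodeids are taken from the output above):

```python
from collections import Counter

def find_duplicate_reports(reports):
    """reports: iterable of (nodeid, outcome) pairs as they appear in
    the short test summary. Returns nodeids reported more than once."""
    counts = Counter(nodeid for nodeid, _outcome in reports)
    return sorted(nodeid for nodeid, n in counts.items() if n > 1)

# Simulated stream matching the doubled summary above: two tests
# each appear twice (once spuriously as PASSED), one appears once.
stream = [
    ("tests/test_1.py::test_a_ok", "PASSED"),
    ("tests/test_1.py::test_a_ok", "PASSED"),
    ("tests/test_1.py::test_b_fail", "PASSED"),
    ("tests/test_1.py::test_b_fail", "FAILED"),
    ("tests/test_2.py::test_c_error", "ERROR"),
]
print(find_duplicate_reports(stream))
# -> ['tests/test_1.py::test_a_ok', 'tests/test_1.py::test_b_fail']
```

Note the characteristic signature in the summary above: every failing test also gets a spurious PASSED line, which matches the "39 passed" total being roughly triple the real 8.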
This seems to only happen on pytest-html release 3.0.0 and higher. If you downgrade to 2.1.1, the issue goes away for me.
It’s such a weird issue.
And I’m reluctant to spend any time on it since we’re working hard on the v4 release.
v4 is basically a complete rebuild of the plugin, so with a little luck this issue won’t resurface.
v4.0.0-rc1 should be released within two weeks as we’re working on the final tweaks.
V4 FTW!! Let me know if you need help testing it.
Just checking in to see if there is an ETA on the V4 release? I'm still using V2.1.1
No ETA I’m afraid.
We discovered a major issue with v4, and addressing that has taken longer than anticipated unfortunately. 😓
My vacation starts soon, hopefully, I’ll have some time then.
Thanks for understanding.