
--durations=0 reports each test twice #8042

Open fenchu opened 3 years ago

fenchu commented 3 years ago
- [ ] output of `pip list` from the virtual environment you are using

pip freeze -l

astroid==2.4.2
atomicwrites==1.4.0
attrs==20.3.0
certifi==2020.11.8
cffi==1.14.3
chardet==3.0.4
colorama==0.4.4
cryptography==3.2.1
idna==2.10
iniconfig==1.1.1
isort==5.6.4
lazy-object-proxy==1.4.3
mccabe==0.6.1
packaging==20.4
pep8==1.7.1
pluggy==0.13.1
py==1.9.0
pycparser==2.20
PyJWT==1.7.1
pylint==2.6.0
pyparsing==2.4.7
pytest==6.1.2
pytest-html==3.0.0
pytest-metadata==1.10.0
requests==2.24.0
six==1.15.0
toml==0.10.2
urllib3==1.25.11
wrapt==1.12.1

- [ ] pytest and operating system versions

python.exe -c 'import pytest; print(pytest.__version__)'
6.1.2

Get-ComputerInfo | select OSName, OSVersion, windowsversion

OsName                          OsVersion  WindowsVersion
------                          ---------  --------------
Microsoft Windows 10 Enterprise 10.0.19042 2009

python.exe --version
Python 3.9.0


- [ ] minimal example if possible

The tests run in containers on Linux (Ubuntu 16.04 LTS) and the durations all display fine there, but not on Windows.

symonk commented 3 years ago

I will investigate this today, bear with me

symonk commented 3 years ago

@fenchu can you share your exact CLI args here? Are you running with verbosity level 2 via `-vv`?

I am unable to recreate:

5.01s call     test_durations_zero_no_longer_duplicates.py::test_delays[5]
4.01s call     test_durations_zero_no_longer_duplicates.py::test_delays[4]
3.01s call     test_durations_zero_no_longer_duplicates.py::test_delays[3]
2.02s call     test_durations_zero_no_longer_duplicates.py::test_delays[2]
1.01s call     test_durations_zero_no_longer_duplicates.py::test_delays[1]
0.00s setup    test_durations_zero_no_longer_duplicates.py::test_delays[1]
0.00s setup    test_durations_zero_no_longer_duplicates.py::test_delays[3]
0.00s setup    test_durations_zero_no_longer_duplicates.py::test_delays[5]
0.00s setup    test_durations_zero_no_longer_duplicates.py::test_delays[2]
0.00s setup    test_durations_zero_no_longer_duplicates.py::test_delays[4]
0.00s teardown test_durations_zero_no_longer_duplicates.py::test_delays[1]
0.00s teardown test_durations_zero_no_longer_duplicates.py::test_delays[5]
0.00s teardown test_durations_zero_no_longer_duplicates.py::test_delays[4]
0.00s teardown test_durations_zero_no_longer_duplicates.py::test_delays[3]
0.00s teardown test_durations_zero_no_longer_duplicates.py::test_delays[2]

Would you be able to provide a gist with a small reproducible example? At a glance I cannot see any plugins that might be adding extra behaviour that could conflict with the runner plugin.

0.02s call     tests/test_02_dsfmock_insert.py::TestClass::test01_gen_users
0.02s call     tests/test_02_dsfmock_insert.py::TestClass::test01_gen_users

is pretty weird tho :) looks like a bug indeed.

I tried recreating it in a virtualenv built from your freeze output, using a simple test like this:

import time
import pytest

@pytest.mark.parametrize('delay', list(range(1, 6)))
def test_delays(delay):
    time.sleep(delay)
    assert True

The result when running in that virtual env:

collected 5 items                                                                                                                                                                                                                         

test_this.py::test_delays[1] PASSED                                                                                                                                                                                                 [ 20%]
test_this.py::test_delays[2] PASSED                                                                                                                                                                                                 [ 40%]
test_this.py::test_delays[3] PASSED                                                                                                                                                                                                 [ 60%]
test_this.py::test_delays[4] PASSED                                                                                                                                                                                                 [ 80%]
test_this.py::test_delays[5] PASSED                                                                                                                                                                                                 [100%]

=========================================================================================================== slowest durations ============================================================================================================
5.01s call     test_this.py::test_delays[5]
4.00s call     test_this.py::test_delays[4]
3.01s call     test_this.py::test_delays[3]
2.01s call     test_this.py::test_delays[2]
1.00s call     test_this.py::test_delays[1]
0.00s setup    test_this.py::test_delays[4]
0.00s setup    test_this.py::test_delays[1]
0.00s setup    test_this.py::test_delays[2]
0.00s setup    test_this.py::test_delays[3]
0.00s setup    test_this.py::test_delays[5]
0.00s teardown test_this.py::test_delays[2]
0.00s teardown test_this.py::test_delays[1]
0.00s teardown test_this.py::test_delays[3]
0.00s teardown test_this.py::test_delays[5]
0.00s teardown test_this.py::test_delays[4]
(.venv) C:\Users\Sy\PycharmProjects\pythonProject1>pytest --version
pytest 6.1.2
(.venv) C:\Users\Sy\PycharmProjects\pythonProject1>python --version
Python 3.8.5
Windows Pro N

Will try with a larger test set.

fenchu commented 3 years ago

Hi, it has something to do with pytest-html: when I downgrade from v3.0.0 to v2.1.1 it is all fine. I have other major issues with pytest-html as well (no logging control, unable to delete extras, etc.).

21.91s call     tests/test_04_scim_insert.py::TestClass::test01_prereg_users[test1]
13.08s call     tests/test_10_scim_delete.py::TestClass::test01_scim_delete_users[test1]
3.93s call     tests/test_01_services.py::TestClass::test01_services_up[test1]
2.82s call     tests/test_02_scim_schema_validation.py::TestClass::test01_schema_validation[test1]
2.70s call     tests/test_10_scim_delete.py::TestClass::test02_prereg_delete_users[test1]
2.47s call     tests/test_05_scim_filter.py::TestClass::test06_get_users[test1]
2.42s call     tests/test_03_dsfmock_insert.py::TestClass::test02_insert_dsfmock[test1]
2.29s call     tests/test_11_dsfmock_delete.py::TestClass::test02_delete_dsfmock[test1]
1.76s call     tests/test_05_scim_filter.py::TestClass::test08_patch_users[test1]
1.35s call     tests/test_05_scim_filter.py::TestClass::test07_put_users[test1]
1.19s call     tests/test_05_scim_filter.py::TestClass::test04_get_users_filter_externalid_sw[test1]
0.84s call     tests/test_05_scim_filter.py::TestClass::test01_scim_config[test1]
0.81s call     tests/test_05_scim_filter.py::TestClass::test05_get_users_filter_externalid_eq[test1]
0.61s call     tests/test_05_scim_filter.py::TestClass::test03_get_users_filter_username_eq[test1]
0.47s call     tests/test_05_scim_filter.py::TestClass::test02_get_users_filter_username_sw[test1]
0.18s call     tests/test_04_scim_insert.py::TestClass::test01_prereg_users_from_file
0.03s call     tests/test_03_dsfmock_insert.py::TestClass::test01_gen_users

Below are the modules that work:

ansi2html==1.6.0        
astroid==2.4.2
atomicwrites==1.4.0     
attrs==20.3.0
certifi==2020.11.8      
cffi==1.14.3
chardet==3.0.4
colorama==0.4.4
cryptography==3.2.1     
idna==2.10
iniconfig==1.1.1        
isort==5.6.4
lazy-object-proxy==1.4.3
mccabe==0.6.1
packaging==20.4
pep8==1.7.1
pluggy==0.13.1
py==1.9.0
pycparser==2.20
PyJWT==1.7.1
pylint==2.6.0
pyparsing==2.4.7
pytest==6.1.2
pytest-html==3.0.0
pytest-metadata==1.10.0
requests==2.24.0
six==1.15.0
toml==0.10.2
urllib3==1.25.11
wrapt==1.12.1

I will try to reproduce it in a simple open example, but it will take some days to find the time.
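
In the meantime, one way to confirm the duplication on the pytest side is to count reports per test phase: the `--durations` table is built from the reports that pass through `pytest_runtest_logreport`, so a plugin that logs a report twice produces a duplicate row. A minimal diagnostic sketch for a conftest.py (a hypothetical helper, not part of pytest or pytest-html):

# conftest.py -- hypothetical duplicate-report detector, diagnostic only
from collections import Counter

_report_counts = Counter()

def pytest_runtest_logreport(report):
    # every setup/call/teardown report passes through this hook; the
    # --durations table is built from these same reports
    _report_counts[(report.nodeid, report.when)] += 1

def pytest_sessionfinish(session):
    duplicates = {key: count for key, count in _report_counts.items() if count > 1}
    if duplicates:
        print("\nduplicated test reports:", duplicates)

If this prints duplicates with pytest-html 3.0.0 installed and nothing after downgrading, that would point squarely at the plugin.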

Zac-HD commented 3 years ago

Hey @fenchu - just checking whether you have that reproducing example?

symonk commented 3 years ago

@fenchu Hi there, should this issue be moved to https://github.com/pytest-dev/pytest-html/issues?

nicoddemus commented 3 years ago

Moving to pytest-dev/pytest-html would indeed be the right call, however we don't have a reproducible example, so I'm wary of doing that at this moment.

I suggest we close this for now, but once we get a reproducible example, we can reopen it and move it over to pytest-dev/pytest-html.

clemente0731 commented 2 years ago

SAME ISSUE

Zac-HD commented 2 years ago

@clemente0731 as above, there's nothing we can do without a reproducible example. Can you please share a minimal repro, along with the versions of Python, pytest, and pytest-html you're using, and the exact output you see?

dbl82 commented 1 year ago

Here is a reproducible example (based on code from https://github.com/pytest-dev/pytest/issues/8741):

import pytest

pytest_plugins = ["pytester"]

@pytest.fixture
def testcontent(testdir):
    testdir.makepyfile(
        """
        import pytest
        import time

        @pytest.fixture(autouse=True, scope="session")
        def slow_start():
            time.sleep(5)

        def test_something():
            time.sleep(1)

        def test_something_else():
            time.sleep(1)
        """
    )
    yield testdir

def test_something(testcontent):
    testcontent.inline_run("--durations=10", "-vv")

def test_something_with_html_report(testcontent):
    testcontent.inline_run("--durations=10", "-vv", "--html=report/report.html")

Output (note that pytester's inline_run executes the inner pytest session in-process, so the inner session's lines appear interleaved with the outer run's progress markers):

Launching pytest with arguments /home/daryl/PycharmProjects/pytest_issue_8042/main.py in /home/daryl/PycharmProjects/pytest_issue_8042

============================= test session starts ==============================
platform linux -- Python 3.7.15, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /home/daryl/.local/share/virtualenvs/pytest_issue_8042-UW7SqagJ/bin/python
cachedir: .pytest_cache
metadata: {'Python': '3.7.15', 'Platform': 'Linux-5.10.176-2-MANJARO-x86_64-with-arch-Manjaro-Linux', 'Packages': {'pytest': '6.2.5', 'pluggy': '1.0.0'}, 'Plugins': {'html': '3.2.0', 'metadata': '2.0.4'}}
rootdir: /home/daryl/PycharmProjects/pytest_issue_8042
plugins: html-3.2.0, metadata-2.0.4
collecting ... collected 2 items

main.py::test_something PASSED                                           [ 50%]============================= test session starts ==============================
platform linux -- Python 3.7.15, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /home/daryl/.local/share/virtualenvs/pytest_issue_8042-UW7SqagJ/bin/python
cachedir: .pytest_cache
metadata: {'Python': '3.7.15', 'Platform': 'Linux-5.10.176-2-MANJARO-x86_64-with-arch-Manjaro-Linux', 'Packages': {'pytest': '6.2.5', 'pluggy': '1.0.0'}, 'Plugins': {'html': '3.2.0', 'metadata': '2.0.4'}}
rootdir: /tmp/pytest-of-daryl/pytest-12/test_something0
plugins: html-3.2.0, metadata-2.0.4
collecting ... collected 2 items

test_something.py::test_something PASSED                                 [ 50%]
test_something.py::test_something_else PASSED                            [100%]

============================= slowest 10 durations =============================
5.00s setup    test_something.py::test_something
1.00s call     test_something.py::test_something
1.00s call     test_something.py::test_something_else
0.00s teardown test_something.py::test_something
0.00s teardown test_something.py::test_something_else
0.00s setup    test_something.py::test_something_else
============================== 2 passed in 7.01s ===============================

main.py::test_something_with_html_report PASSED                          [100%]============================= test session starts ==============================
platform linux -- Python 3.7.15, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /home/daryl/.local/share/virtualenvs/pytest_issue_8042-UW7SqagJ/bin/python
cachedir: .pytest_cache
metadata: {'Python': '3.7.15', 'Platform': 'Linux-5.10.176-2-MANJARO-x86_64-with-arch-Manjaro-Linux', 'Packages': {'pytest': '6.2.5', 'pluggy': '1.0.0'}, 'Plugins': {'html': '3.2.0', 'metadata': '2.0.4'}}
rootdir: /tmp/pytest-of-daryl/pytest-12/test_something_with_html_report0
plugins: html-3.2.0, metadata-2.0.4
collecting ... collected 2 items

test_something_with_html_report.py::test_something PASSED                [ 50%]
test_something_with_html_report.py::test_something_else PASSED           [100%]

- generated html file: file:///tmp/pytest-of-daryl/pytest-12/test_something_with_html_report0/report/report.html -
============================= slowest 10 durations =============================
6.01s call     test_something_with_html_report.py::test_something
5.00s setup    test_something_with_html_report.py::test_something
1.00s call     test_something_with_html_report.py::test_something_else
1.00s call     test_something_with_html_report.py::test_something
1.00s call     test_something_with_html_report.py::test_something_else
0.00s setup    test_something_with_html_report.py::test_something_else
============================== 2 passed in 7.02s ===============================

============================== 2 passed in 14.10s ==============================

Process finished with exit code 0

Also ran it with the latest pytest:

Launching pytest with arguments /home/daryl/PycharmProjects/pytest_issue_8042/main.py in /home/daryl/PycharmProjects/pytest_issue_8042

============================= test session starts ==============================
platform linux -- Python 3.7.15, pytest-7.3.1, pluggy-1.0.0 -- /home/daryl/.local/share/virtualenvs/pytest_issue_8042-UW7SqagJ/bin/python
cachedir: .pytest_cache
metadata: {'Python': '3.7.15', 'Platform': 'Linux-5.10.176-2-MANJARO-x86_64-with-arch-Manjaro-Linux', 'Packages': {'pytest': '7.3.1', 'pluggy': '1.0.0'}, 'Plugins': {'html': '3.2.0', 'metadata': '2.0.4'}}
rootdir: /home/daryl/PycharmProjects/pytest_issue_8042
plugins: html-3.2.0, metadata-2.0.4
collecting ... collected 2 items

main.py::test_something PASSED                                           [ 50%]============================= test session starts ==============================
platform linux -- Python 3.7.15, pytest-7.3.1, pluggy-1.0.0 -- /home/daryl/.local/share/virtualenvs/pytest_issue_8042-UW7SqagJ/bin/python
cachedir: .pytest_cache
metadata: {'Python': '3.7.15', 'Platform': 'Linux-5.10.176-2-MANJARO-x86_64-with-arch-Manjaro-Linux', 'Packages': {'pytest': '7.3.1', 'pluggy': '1.0.0'}, 'Plugins': {'html': '3.2.0', 'metadata': '2.0.4'}}
rootdir: /tmp/pytest-of-daryl/pytest-13/test_something0
plugins: html-3.2.0, metadata-2.0.4
collecting ... collected 2 items

test_something.py::test_something PASSED                                 [ 50%]
test_something.py::test_something_else PASSED                            [100%]

============================= slowest 10 durations =============================
5.00s setup    test_something.py::test_something
1.00s call     test_something.py::test_something_else
1.00s call     test_something.py::test_something
0.00s setup    test_something.py::test_something_else
0.00s teardown test_something.py::test_something
0.00s teardown test_something.py::test_something_else
============================== 2 passed in 7.01s ===============================

main.py::test_something_with_html_report PASSED                          [100%]============================= test session starts ==============================
platform linux -- Python 3.7.15, pytest-7.3.1, pluggy-1.0.0 -- /home/daryl/.local/share/virtualenvs/pytest_issue_8042-UW7SqagJ/bin/python
cachedir: .pytest_cache
metadata: {'Python': '3.7.15', 'Platform': 'Linux-5.10.176-2-MANJARO-x86_64-with-arch-Manjaro-Linux', 'Packages': {'pytest': '7.3.1', 'pluggy': '1.0.0'}, 'Plugins': {'html': '3.2.0', 'metadata': '2.0.4'}}
rootdir: /tmp/pytest-of-daryl/pytest-13/test_something_with_html_report0
plugins: html-3.2.0, metadata-2.0.4
collecting ... collected 2 items

test_something_with_html_report.py::test_something PASSED                [ 50%]
test_something_with_html_report.py::test_something_else PASSED           [100%]

- generated html file: file:///tmp/pytest-of-daryl/pytest-13/test_something_with_html_report0/report/report.html -
============================= slowest 10 durations =============================
6.01s call     test_something_with_html_report.py::test_something
5.01s setup    test_something_with_html_report.py::test_something
1.00s call     test_something_with_html_report.py::test_something_else
1.00s call     test_something_with_html_report.py::test_something_else
1.00s call     test_something_with_html_report.py::test_something
0.00s setup    test_something_with_html_report.py::test_something_else
============================== 2 passed in 7.03s ===============================

============================== 2 passed in 14.12s ==============================

Process finished with exit code 0

Note that in the first run there are two 0.00s "teardown" entries, while in the second run (with the HTML report) there are extra "call" entries instead. The reported timings also differ.
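
That observation is consistent with how the durations table is produced. Roughly paraphrased (a simplified sketch of pytest's terminal-summary logic in _pytest/runner.py, not the verbatim source), every report carrying a duration attribute contributes one row, so a report that gets logged twice becomes two rows:

# simplified sketch of pytest's durations reporting, not the verbatim source
def pytest_terminal_summary(terminalreporter):
    durations = terminalreporter.config.option.durations  # N from --durations=N
    if durations is None:
        return
    dlist = []
    for replist in terminalreporter.stats.values():
        for rep in replist:
            if hasattr(rep, "duration"):
                # one row per report: a duplicated report means a duplicated row
                dlist.append(rep)
    dlist.sort(key=lambda rep: rep.duration, reverse=True)
    if durations:  # --durations=0 means "report everything"
        dlist = dlist[:durations]
    for rep in dlist:
        terminalreporter.write_line(f"{rep.duration:02.2f}s {rep.when:<8} {rep.nodeid}")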

chipbutty commented 1 month ago

Hi, can the durations statistics be captured in the pytest-html report or the junit-xml report? https://stackoverflow.com/questions/78827029/how-to-capture-pytest-durations-statistics-within-the-test-report

Thanks,
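
Not built in, as far as I know: junit-xml records a per-test time attribute but not the --durations-style phase breakdown. One possible workaround is to collect the same per-phase durations in a conftest.py and write them next to the report at session end. A sketch (the durations.json filename and structure are assumptions, not anything pytest-html or junit-xml produce):

# conftest.py -- hypothetical sidecar writer; "durations.json" is an assumed
# output name, not something pytest-html or junit-xml produce themselves
import json

_durations = []

def pytest_runtest_logreport(report):
    # report.duration covers the phase named by report.when (setup/call/teardown)
    _durations.append(
        {"nodeid": report.nodeid, "when": report.when, "duration": report.duration}
    )

def pytest_sessionfinish(session, exitstatus):
    _durations.sort(key=lambda entry: entry["duration"], reverse=True)
    with open("durations.json", "w") as fh:
        json.dump(_durations, fh, indent=2)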