coherent-oss / pytest-flake8

pytest plugin to run flake8
MIT License

1.2.0: ten tests fail #2

Closed: mtelka closed this issue 3 months ago

mtelka commented 3 months ago

Tests were run using `tox -e py39`:

=========================== short test summary info ============================
FAILED tests/test_flake8.py::test_version - AttributeError: module 'pytest_fl...
FAILED tests/test_flake8.py::TestIgnores::test_default_flake8_ignores - Asser...
FAILED tests/test_flake8.py::TestIgnores::test_ignores_all - AssertionError: ...
FAILED tests/test_flake8.py::TestIgnores::test_w293w292 - Failed: nomatch: '*...
FAILED tests/test_flake8.py::TestIgnores::test_mtime_caching - Failed: nomatc... 
FAILED tests/test_flake8.py::test_ok_verbose - AssertionError: assert {'error...
FAILED tests/test_flake8.py::test_keyword_match - Failed: nomatch: '*E201*'
FAILED tests/test_flake8.py::test_run_on_init_file - AssertionError: assert {...
FAILED tests/test_flake8.py::test_unicode_error - AttributeError: module 'py'...
FAILED tests/test_flake8.py::test_junit_classname - AssertionError: assert {'...
======= 10 failed, 8 passed, 1 xfailed, 10 warnings in 77.11s (0:01:17) ========
mtelka commented 3 months ago

The full test log is very long (>800 kB), so here are just a few of the opening lines:

.pkg: install_requires> python -I -m pip install 'setuptools>=61.2' 'setuptools_scm[toml]>=3.4.1'
.pkg: _optional_hooks> python /usr/lib/python3.9/vendor-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: get_requires_for_build_editable> python /usr/lib/python3.9/vendor-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: install_requires_for_build_editable> python -I -m pip install wheel
.pkg: build_editable> python /usr/lib/python3.9/vendor-packages/pyproject_api/_backend.py True setuptools.build_meta
py39: install_package_deps> python -I -m pip install 'flake8>=4.0' 'pytest!=8.1.*,>=6' 'pytest-checkdocs>=2.4' pytest-cov 'pytest-enabler>=2.2' pytest-mypy 'pytest>=7.0'
py39: install_package> python -I -m pip install --force-reinstall --no-deps /tmp/test/pytest_flake8-1.2.0/.tox/.tmp/package/1/pytest_flake8-1.2.0-0.editable-py3-none-any.whl
py39: commands[0]> pytest
============================= test session starts ==============================
platform sunos5 -- Python 3.9.19, pytest-8.2.2, pluggy-1.5.0
cachedir: .tox/py39/.pytest_cache
rootdir: /tmp/test/pytest_flake8-1.2.0
configfile: pytest.ini
plugins: mypy-0.10.3, enabler-3.1.1, cov-5.0.0, checkdocs-2.13.0, flake8-1.2.0
collected 19 items

docs/conf.py ..                                                          [ 10%]
. .                                                                      [ 15%]
pytest_flake8.py .                                                       [ 21%]
. .                                                                      [ 21%]
tests/test_flake8.py .F.FFFF.FFFFxF

=================================== FAILURES ===================================
_________________________________ test_version _________________________________

    def test_version():
        """Verify we can get version."""
        import pytest_flake8

>       assert pytest_flake8.__version__
E       AttributeError: module 'pytest_flake8' has no attribute '__version__'

tests/test_flake8.py:16: AttributeError
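The `AttributeError` above suggests the module no longer defines `__version__` at import time. A common fallback in this situation (a sketch only; it is not necessarily the fix applied in 1b576d3, and the `dist_version` helper name is hypothetical) is to resolve the installed distribution's version via `importlib.metadata`:

```python
# Sketch: look up a distribution's version at runtime when the module
# itself does not export __version__.
from importlib.metadata import PackageNotFoundError, version


def dist_version(dist_name: str) -> str:
    """Return the installed version of dist_name, or 'unknown' if absent."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return "unknown"


# A missing distribution degrades gracefully instead of raising:
print(dist_version("no-such-distribution"))
```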
___________________ TestIgnores.test_default_flake8_ignores ____________________

self = <tests.test_flake8.TestIgnores object at 0x7fffad64e490>
testdir = <Testdir local('/tmp/pytest-of-marcel/pytest-4/test_default_flake8_ignores0')>

    def test_default_flake8_ignores(self, testdir):
        testdir.makeini("""
            [pytest]
            markers = flake8

            [flake8]
            ignore = E203
                *.py E300
                tests/*.py ALL E203  # something
        """)
        testdir.tmpdir.ensure("xy.py")
        testdir.tmpdir.ensure("tests/hello.py")
        result = testdir.runpytest("--flake8", "-s")
>       result.assert_outcomes(passed=2)
E       AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E
E         Omitting 4 identical items, use -vv to show
E         Differing items:
E         {'failed': 2} != {'failed': 0}
E         {'passed': 3} != {'passed': 2}
E         Use -v to get more diff

/tmp/test/pytest_flake8-1.2.0/tests/test_flake8.py:56: AssertionError
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform sunos5 -- Python 3.9.19, pytest-8.2.2, pluggy-1.5.0
rootdir: /tmp/pytest-of-marcel/pytest-4/test_default_flake8_ignores0
configfile: tox.ini
plugins: mypy-0.10.3, enabler-3.1.1, cov-5.0.0, checkdocs-2.13.0, flake8-1.2.0
collected 5 items

tests/hello.py F..
xy.py F.

=================================== FAILURES ===================================
_________________________________ FLAKE8-check _________________________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_and_report.<locals>.<lambda> at 0x7fffab849f70>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: Callable[[], TResult],
        when: Literal["collect", "setup", "call", "teardown"],
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        """Call func, wrapping the result in a CallInfo.

        :param func:
            The function to call. Called without arguments.
        :param when:
            The phase in which the function is called.
        :param reraise:
            Exception or exceptions that shall propagate if raised by the
            function, instead of being wrapped in the CallInfo.
        """
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/tmp/test/pytest_flake8-1.2.0/.tox/py39/lib/python3.9/site-packages/_pytest/runner.py:341:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>       lambda: runtest_hook(item=item, **kwds), when=when, reraise=reraise
    )

[...snip...]
jaraco commented 3 months ago

I get a similar set of failures if I pin to flake8<6. Probably the project should do at least that to avoid the breakage introduced in flake8 7.
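A minimal sketch of such a pin, assuming the dependency is declared in `pyproject.toml` (the exact file layout and surrounding specifiers are assumptions based on the `pip install 'flake8>=4.0'` line in the log above):

```toml
[project]
dependencies = [
    # the <6 cap is the one tested above; a <7 cap would be the
    # minimum needed to avoid the breakage introduced in flake8 7
    "flake8 >= 4.0, < 6",
]
```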

jaraco commented 3 months ago

I fixed one of the test failures in 1b576d3.

jaraco commented 3 months ago

Here's the outcome I see for that test in my environment:

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> captured log >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
WARNING  flake8.checker:checker.py:105 The multiprocessing module is not available. Ignoring --jobs arguments.
WARNING  flake8.checker:checker.py:105 The multiprocessing module is not available. Ignoring --jobs arguments.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> traceback >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

self = <tests.test_flake8.TestIgnores object at 0x102d22d20>
testdir = <Testdir local('/private/var/folders/f2/2plv6q2n7l932m2x004jlw340000gn/T/pytest-of-jaraco/pytest-152/test_default_flake8_ignores0')>

    def test_default_flake8_ignores(self, testdir):
        testdir.makeini("""
            [pytest]
            markers = flake8

            [flake8]
            ignore = E203
                *.py E300
                tests/*.py ALL E203  # something
        """)
        testdir.tmpdir.ensure("xy.py")
        testdir.tmpdir.ensure("tests/hello.py")
        result = testdir.runpytest("--flake8", "-s")
>       result.assert_outcomes(passed=2)
E       AssertionError: assert {'passed': 9, 'skipped': 0, 'failed': 0, 'errors': 0, 'xpassed': 0, 'xfailed': 0} == {'passed': 2, 'skipped': 0, 'failed': 0, 'errors': 0, 'xpassed': 0, 'xfailed': 0}
E         
E         Common items:
E         {'errors': 0, 'failed': 0, 'skipped': 0, 'xfailed': 0, 'xpassed': 0}
E         Differing items:
E         {'passed': 9} != {'passed': 2}
E         
E         Full diff:
E           {
E               'errors': 0,
E               'failed': 0,
E         -     'passed': 2,
E         ?               ^
E         +     'passed': 9,
E         ?               ^
E               'skipped': 0,
E               'xfailed': 0,
E               'xpassed': 0,
E           }

/Users/jaraco/code/coherent-oss/pytest-flake8/tests/test_flake8.py:48: AssertionError

I'm thinking what's going on here is that the introduction of pytest-enabler has caused pytester to run more plugins than it did under the previous regime.
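If that hypothesis is right, one way to check it (a sketch; `PYTEST_DISABLE_PLUGIN_AUTOLOAD` and `-p no:NAME` are real pytest mechanisms, but applying them to these tests is an assumption) is to keep entry-point plugins out of the inner pytester runs and see whether the expected outcome counts come back:

```python
# Sketch: disable setuptools entry-point plugin autoloading so inner
# pytest runs see only explicitly requested plugins (cov, mypy,
# checkdocs, enabler would then stay out of pytester's collection).
import os

os.environ["PYTEST_DISABLE_PLUGIN_AUTOLOAD"] = "1"

# Equivalent per-run form inside a single test (hypothetical edit):
#   result = testdir.runpytest("--flake8", "-s", "-p", "no:mypy", "-p", "no:cov")

print(os.environ["PYTEST_DISABLE_PLUGIN_AUTOLOAD"])
```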