agronholm / anyio

High level asynchronous concurrency and networking framework that works on top of either trio or asyncio

3.3.1: pytest hangs in `tests/test_compat.py::TestMaybeAsync::test_cancel_scope[trio]` unit #369

kloczek opened 3 years ago

kloczek commented 3 years ago

trio 41.0.

+ /usr/bin/pytest -ra -p no:itsdangerous -p no:randomly -v
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/.hypothesis/examples')
rootdir: /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1, configfile: pyproject.toml, testpaths: tests
plugins: anyio-3.3.1, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, aspectlib-1.5.2, toolbox-0.5, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, xprocess-0.18.1, black-0.3.12, asyncio-0.15.1, subtests-0.5.0, isort-2.0.0, hypothesis-6.14.6, mock-3.6.1, profiling-1.7.0, Faker-8.12.1, nose2pytest-1.0.8, pyfakefs-4.5.1, tornado-0.8.1, twisted-1.13.3, aiohttp-0.3.0
collected 1245 items

tests/test_compat.py::TestMaybeAsync::test_cancel_scope[asyncio] PASSED                                                                                              [  0%]
tests/test_compat.py::TestMaybeAsync::test_cancel_scope[asyncio+uvloop] PASSED                                                                                       [  0%]
tests/test_compat.py::TestMaybeAsync::test_cancel_scope[trio]

... and that is all. ps auxwf shows only:

tkloczko 2198904  1.4  0.0 6576524 115972 pts/7  S+   10:35   0:10                  \_ /usr/bin/python3 /usr/bin/pytest -ra -p no:itsdangerous -p no:randomly -v
[tkloczko@barrel SPECS]$ strace -p 2200436
strace: Process 2200436 attached
futex(0x55a89bdec820, FUTEX_WAIT_BITSET_PRIVATE|FUTEX_CLOCK_REALTIME, 0, NULL, FUTEX_BITSET_MATCH_ANY

kloczek commented 3 years ago

Looks like pytest hangs in every other *[trio] unit :/
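
One way to see where a hanging test is stuck (a sketch, not something tried above; the 60-second limit is arbitrary) is to let pytest's built-in faulthandler dump tracebacks for tests that run too long:

/usr/bin/pytest -ra -p no:itsdangerous -p no:randomly -v -o faulthandler_timeout=60 tests/test_compat.py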

agronholm commented 3 years ago

Why are we not seeing any hangs locally or on CI then?

agronholm commented 3 years ago

What's this about "trio 41.0"? What is that number? The latest trio release is 0.19.0. Also note that you seem to have the asyncio pytest plugin which interferes with async tests. Please only use the required test plugins as other plugins may alter the way pytest runs the test suite.
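
For reference, the installed trio version can be checked directly (a minimal sketch; the interpreter path is assumed to be the same one used for the test run):

/usr/bin/python3 -c 'import trio; print(trio.__version__)'
/usr/bin/python3 -m pip show trio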

agronholm commented 3 years ago

Running pytest with -p no:asyncio may help.

kloczek commented 3 years ago

Sorry, correction: trio 0.19.0 https://github.com/python-trio/trio

> Running pytest with -p no:asyncio may help.

Partially .. with that flag, here is one probe:

+ /usr/bin/pytest -ra -p no:itsdangerous -p no:randomly -p no:asyncio -v
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/.hypothesis/examples')
rootdir: /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1, configfile: pyproject.toml, testpaths: tests
plugins: anyio-3.3.1, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, aspectlib-1.5.2, toolbox-0.5, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, xprocess-0.18.1, black-0.3.12, subtests-0.5.0, isort-2.0.0, hypothesis-6.14.6, mock-3.6.1, profiling-1.7.0, Faker-8.12.1, nose2pytest-1.0.8, pyfakefs-4.5.1, tornado-0.8.1, twisted-1.13.3, aiohttp-0.3.0
collected 1245 items

tests/test_compat.py::TestMaybeAsync::test_cancel_scope[asyncio] PASSED                                                                                              [  0%]
tests/test_compat.py::TestMaybeAsync::test_cancel_scope[asyncio+uvloop] PASSED                                                                                       [  0%]
tests/test_compat.py::TestMaybeAsync::test_cancel_scope[trio] PASSED                                                                                                 [  0%]
tests/test_compat.py::TestMaybeAsync::test_current_time[asyncio] PASSED                                                                                              [  0%]
tests/test_compat.py::TestMaybeAsync::test_current_time[asyncio+uvloop] PASSED                                                                                       [  0%]
tests/test_compat.py::TestMaybeAsync::test_current_time[trio] PASSED                                                                                                 [  0%]
tests/test_compat.py::TestMaybeAsync::test_current_effective_deadline[asyncio] PASSED                                                                                [  0%]
tests/test_compat.py::TestMaybeAsync::test_current_effective_deadline[asyncio+uvloop] PASSED                                                                         [  0%]
tests/test_compat.py::TestMaybeAsync::test_current_effective_deadline[trio] PASSED                                                                                   [  0%]
tests/test_compat.py::TestMaybeAsync::test_get_running_tasks[asyncio] PASSED                                                                                         [  0%]
tests/test_compat.py::TestMaybeAsync::test_get_running_tasks[asyncio+uvloop] PASSED                                                                                  [  0%]
tests/test_compat.py::TestMaybeAsync::test_get_running_tasks[trio] PASSED                                                                                            [  0%]
tests/test_compat.py::TestMaybeAsync::test_get_current_task[asyncio] PASSED                                                                                          [  1%]
tests/test_compat.py::TestMaybeAsync::test_get_current_task[asyncio+uvloop] PASSED                                                                                   [  1%]
tests/test_compat.py::TestMaybeAsync::test_get_current_task[trio] PASSED                                                                                             [  1%]
tests/test_compat.py::test_maybe_async_cm[asyncio] PASSED                                                                                                            [  1%]
tests/test_compat.py::test_maybe_async_cm[asyncio+uvloop] PASSED                                                                                                     [  1%]
tests/test_compat.py::test_maybe_async_cm[trio] PASSED                                                                                                               [  1%]
tests/test_compat.py::TestDeprecations::test_current_effective_deadlinee[asyncio] PASSED                                                                             [  1%]
tests/test_compat.py::TestDeprecations::test_current_effective_deadlinee[asyncio+uvloop] PASSED                                                                      [  1%]
tests/test_compat.py::TestDeprecations::test_current_effective_deadlinee[trio] PASSED                                                                                [  1%]
tests/test_compat.py::TestDeprecations::test_current_time[asyncio] PASSED                                                                                            [  1%]
tests/test_compat.py::TestDeprecations::test_current_time[asyncio+uvloop] PASSED                                                                                     [  1%]
tests/test_compat.py::TestDeprecations::test_current_time[trio] PASSED                                                                                               [  1%]
tests/test_compat.py::TestDeprecations::test_get_current_task[asyncio] PASSED                                                                                        [  2%]
tests/test_compat.py::TestDeprecations::test_get_current_task[asyncio+uvloop] PASSED                                                                                 [  2%]
tests/test_compat.py::TestDeprecations::test_get_current_task[trio] PASSED                                                                                           [  2%]
tests/test_compat.py::TestDeprecations::test_running_tasks[asyncio] PASSED                                                                                           [  2%]
tests/test_compat.py::TestDeprecations::test_running_tasks[asyncio+uvloop] PASSED                                                                                    [  2%]
tests/test_compat.py::TestDeprecations::test_running_tasks[trio] PASSED                                                                                              [  2%]
tests/test_compat.py::TestDeprecations::test_open_signal_receiver[asyncio] PASSED                                                                                    [  2%]
tests/test_compat.py::TestDeprecations::test_open_signal_receiver[asyncio+uvloop] PASSED                                                                             [  2%]
tests/test_compat.py::TestDeprecations::test_open_signal_receiver[trio] PASSED                                                                                       [  2%]
tests/test_compat.py::TestDeprecations::test_cancelscope_cancel[asyncio] PASSED                                                                                      [  2%]
tests/test_compat.py::TestDeprecations::test_cancelscope_cancel[asyncio+uvloop] PASSED                                                                               [  2%]
tests/test_compat.py::TestDeprecations::test_cancelscope_cancel[trio] PASSED                                                                                         [  2%]
tests/test_compat.py::TestDeprecations::test_taskgroup_cancel[asyncio] PASSED                                                                                        [  2%]
tests/test_compat.py::TestDeprecations::test_taskgroup_cancel[asyncio+uvloop] PASSED                                                                                 [  3%]
tests/test_compat.py::TestDeprecations::test_taskgroup_cancel[trio] PASSED                                                                                           [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_acquire_nowait[asyncio] PASSED                                                                          [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_acquire_nowait[asyncio+uvloop] PASSED                                                                   [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_acquire_nowait[trio] PASSED                                                                             [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_acquire_on_behalf_of_nowait[asyncio] PASSED                                                             [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_acquire_on_behalf_of_nowait[asyncio+uvloop] PASSED                                                      [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_acquire_on_behalf_of_nowait[trio] PASSED                                                                [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_set_total_tokens[asyncio] PASSED                                                                        [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_set_total_tokens[asyncio+uvloop] PASSED                                                                 [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_set_total_tokens[trio] PASSED                                                                           [  3%]
tests/test_compat.py::TestDeprecations::test_condition_release[asyncio] PASSED                                                                                       [  3%]
tests/test_compat.py::TestDeprecations::test_condition_release[asyncio+uvloop] PASSED                                                                                [  4%]
tests/test_compat.py::TestDeprecations::test_condition_release[trio] PASSED                                                                                          [  4%]
tests/test_compat.py::TestDeprecations::test_event_set[asyncio] PASSED                                                                                               [  4%]
tests/test_compat.py::TestDeprecations::test_event_set[asyncio+uvloop] PASSED                                                                                        [  4%]
tests/test_compat.py::TestDeprecations::test_event_set[trio] PASSED                                                                                                  [  4%]
tests/test_compat.py::TestDeprecations::test_lock_release[asyncio] PASSED                                                                                            [  4%]
tests/test_compat.py::TestDeprecations::test_lock_release[asyncio+uvloop] PASSED                                                                                     [  4%]
tests/test_compat.py::TestDeprecations::test_lock_release[trio] PASSED                                                                                               [  4%]
tests/test_compat.py::TestDeprecations::test_memory_object_stream_send_nowait[asyncio] PASSED                                                                        [  4%]
tests/test_compat.py::TestDeprecations::test_memory_object_stream_send_nowait[asyncio+uvloop] PASSED                                                                 [  4%]
tests/test_compat.py::TestDeprecations::test_memory_object_stream_send_nowait[trio] PASSED                                                                           [  4%]
tests/test_compat.py::TestDeprecations::test_semaphore_release[asyncio] PASSED                                                                                       [  4%]
tests/test_compat.py::TestDeprecations::test_semaphore_release[asyncio+uvloop] PASSED                                                                                [  4%]
tests/test_compat.py::TestDeprecations::test_semaphore_release[trio] PASSED                                                                                          [  5%]
tests/test_compat.py::TestDeprecations::test_move_on_after[asyncio] PASSED                                                                                           [  5%]
tests/test_compat.py::TestDeprecations::test_move_on_after[asyncio+uvloop] PASSED                                                                                    [  5%]
tests/test_compat.py::TestDeprecations::test_move_on_after[trio] PASSED                                                                                              [  5%]
tests/test_compat.py::TestDeprecations::test_fail_after[asyncio] PASSED                                                                                              [  5%]
tests/test_compat.py::TestDeprecations::test_fail_after[asyncio+uvloop] PASSED                                                                                       [  5%]
tests/test_compat.py::TestDeprecations::test_fail_after[trio] PASSED                                                                                                 [  5%]
tests/test_compat.py::TestDeprecations::test_run_sync_in_worker_thread[asyncio] PASSED                                                                               [  5%]
tests/test_compat.py::TestDeprecations::test_run_sync_in_worker_thread[asyncio+uvloop] PASSED                                                                        [  5%]
tests/test_compat.py::TestDeprecations::test_run_sync_in_worker_thread[trio] PASSED                                                                                  [  5%]
tests/test_compat.py::TestDeprecations::test_run_async_from_thread[asyncio] PASSED                                                                                   [  5%]
tests/test_compat.py::TestDeprecations::test_run_async_from_thread[asyncio+uvloop] PASSED                                                                            [  5%]
tests/test_compat.py::TestDeprecations::test_run_async_from_thread[trio] PASSED                                                                                      [  6%]
tests/test_compat.py::TestDeprecations::test_run_sync_from_thread[asyncio] PASSED                                                                                    [  6%]
tests/test_compat.py::TestDeprecations::test_run_sync_from_thread[asyncio+uvloop] PASSED                                                                             [  6%]
tests/test_compat.py::TestDeprecations::test_run_sync_from_thread[trio] PASSED                                                                                       [  6%]
tests/test_compat.py::TestDeprecations::test_current_default_worker_thread_limiter[asyncio] PASSED                                                                   [  6%]
tests/test_compat.py::TestDeprecations::test_current_default_worker_thread_limiter[asyncio+uvloop] PASSED                                                            [  6%]
tests/test_compat.py::TestDeprecations::test_current_default_worker_thread_limiter[trio] PASSED                                                                      [  6%]
tests/test_compat.py::TestDeprecations::test_create_blocking_portal[asyncio] PASSED                                                                                  [  6%]
tests/test_compat.py::TestDeprecations::test_create_blocking_portal[asyncio+uvloop] PASSED                                                                           [  6%]
tests/test_compat.py::TestDeprecations::test_create_blocking_portal[trio] PASSED                                                                                     [  6%]
tests/test_compat.py::TestPickle::test_deprecated_awaitable_none PASSED                                                                                              [  6%]
tests/test_compat.py::TestPickle::test_deprecated_awaitable_float PASSED                                                                                             [  6%]
tests/test_compat.py::TestPickle::test_deprecated_awaitable_list PASSED                                                                                              [  6%]
tests/test_debugging.py::test_main_task_name[asyncio] PASSED                                                                                                         [  7%]
tests/test_debugging.py::test_main_task_name[asyncio+uvloop] PASSED                                                                                                  [  7%]
tests/test_debugging.py::test_main_task_name[trio] FAILED                                                                                                            [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio-None-tests.test_debugging.test_non_main_task_name.<locals>.non_main] PASSED                                 [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio-name-b'name'] PASSED                                                                                        [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio-name-name] PASSED                                                                                           [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio--] PASSED                                                                                                   [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio+uvloop-None-tests.test_debugging.test_non_main_task_name.<locals>.non_main] PASSED                          [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio+uvloop-name-b'name'] PASSED                                                                                 [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio+uvloop-name-name] PASSED                                                                                    [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio+uvloop--] PASSED                                                                                            [  7%]
tests/test_debugging.py::test_non_main_task_name[trio-None-tests.test_debugging.test_non_main_task_name.<locals>.non_main]

So I've started adding more units to the deselect list:

+ /usr/bin/pytest -ra -p no:itsdangerous -p no:randomly -p no:asyncio -vvv --deselect 'tests/test_debugging.py::test_non_main_task_name[trio-None-tests.test_debugging.test_non_main_task_name.<locals>.non_main]' --deselect 'tests/test_debugging.py::test_non_main_task_name[trio-name-b'\''name'\'']' --deselect 'tests/test_debugging.py::test_non_main_task_name[trio-name-name]' --deselect 'tests/test_debugging.py::test_non_main_task_name[trio--]' --deselect 'tests/test_debugging.py::test_get_running_tasks[trio]'
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/.hypothesis/examples')
rootdir: /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1, configfile: pyproject.toml, testpaths: tests
plugins: anyio-3.3.1, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, aspectlib-1.5.2, toolbox-0.5, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, cases-3.6.3, xprocess-0.18.1, black-0.3.12, subtests-0.5.0, isort-2.0.0, hypothesis-6.14.6, mock-3.6.1, profiling-1.7.0, Faker-8.12.1, nose2pytest-1.0.8, pyfakefs-4.5.1, tornado-0.8.1, twisted-1.13.3, aiohttp-0.3.0
collected 1245 items / 5 deselected / 1240 selected

tests/test_compat.py::TestMaybeAsync::test_cancel_scope[asyncio] PASSED                                                                                              [  0%]
tests/test_compat.py::TestMaybeAsync::test_cancel_scope[asyncio+uvloop] PASSED                                                                                       [  0%]
tests/test_compat.py::TestMaybeAsync::test_cancel_scope[trio] PASSED                                                                                                 [  0%]
tests/test_compat.py::TestMaybeAsync::test_current_time[asyncio] PASSED                                                                                              [  0%]
tests/test_compat.py::TestMaybeAsync::test_current_time[asyncio+uvloop] PASSED                                                                                       [  0%]
tests/test_compat.py::TestMaybeAsync::test_current_time[trio] PASSED                                                                                                 [  0%]
tests/test_compat.py::TestMaybeAsync::test_current_effective_deadline[asyncio] PASSED                                                                                [  0%]
tests/test_compat.py::TestMaybeAsync::test_current_effective_deadline[asyncio+uvloop] PASSED                                                                         [  0%]
tests/test_compat.py::TestMaybeAsync::test_current_effective_deadline[trio] PASSED                                                                                   [  0%]
tests/test_compat.py::TestMaybeAsync::test_get_running_tasks[asyncio] PASSED                                                                                         [  0%]
tests/test_compat.py::TestMaybeAsync::test_get_running_tasks[asyncio+uvloop] PASSED                                                                                  [  0%]
tests/test_compat.py::TestMaybeAsync::test_get_running_tasks[trio] PASSED                                                                                            [  0%]
tests/test_compat.py::TestMaybeAsync::test_get_current_task[asyncio] PASSED                                                                                          [  1%]
tests/test_compat.py::TestMaybeAsync::test_get_current_task[asyncio+uvloop] PASSED                                                                                   [  1%]
tests/test_compat.py::TestMaybeAsync::test_get_current_task[trio] PASSED                                                                                             [  1%]
tests/test_compat.py::test_maybe_async_cm[asyncio] PASSED                                                                                                            [  1%]
tests/test_compat.py::test_maybe_async_cm[asyncio+uvloop] PASSED                                                                                                     [  1%]
tests/test_compat.py::test_maybe_async_cm[trio] PASSED                                                                                                               [  1%]
tests/test_compat.py::TestDeprecations::test_current_effective_deadlinee[asyncio] PASSED                                                                             [  1%]
tests/test_compat.py::TestDeprecations::test_current_effective_deadlinee[asyncio+uvloop] PASSED                                                                      [  1%]
tests/test_compat.py::TestDeprecations::test_current_effective_deadlinee[trio] PASSED                                                                                [  1%]
tests/test_compat.py::TestDeprecations::test_current_time[asyncio] PASSED                                                                                            [  1%]
tests/test_compat.py::TestDeprecations::test_current_time[asyncio+uvloop] PASSED                                                                                     [  1%]
tests/test_compat.py::TestDeprecations::test_current_time[trio] PASSED                                                                                               [  1%]
tests/test_compat.py::TestDeprecations::test_get_current_task[asyncio] PASSED                                                                                        [  2%]
tests/test_compat.py::TestDeprecations::test_get_current_task[asyncio+uvloop] PASSED                                                                                 [  2%]
tests/test_compat.py::TestDeprecations::test_get_current_task[trio] PASSED                                                                                           [  2%]
tests/test_compat.py::TestDeprecations::test_running_tasks[asyncio] PASSED                                                                                           [  2%]
tests/test_compat.py::TestDeprecations::test_running_tasks[asyncio+uvloop] PASSED                                                                                    [  2%]
tests/test_compat.py::TestDeprecations::test_running_tasks[trio] PASSED                                                                                              [  2%]
tests/test_compat.py::TestDeprecations::test_open_signal_receiver[asyncio] PASSED                                                                                    [  2%]
tests/test_compat.py::TestDeprecations::test_open_signal_receiver[asyncio+uvloop] PASSED                                                                             [  2%]
tests/test_compat.py::TestDeprecations::test_open_signal_receiver[trio] PASSED                                                                                       [  2%]
tests/test_compat.py::TestDeprecations::test_cancelscope_cancel[asyncio] PASSED                                                                                      [  2%]
tests/test_compat.py::TestDeprecations::test_cancelscope_cancel[asyncio+uvloop] PASSED                                                                               [  2%]
tests/test_compat.py::TestDeprecations::test_cancelscope_cancel[trio] PASSED                                                                                         [  2%]
tests/test_compat.py::TestDeprecations::test_taskgroup_cancel[asyncio] PASSED                                                                                        [  2%]
tests/test_compat.py::TestDeprecations::test_taskgroup_cancel[asyncio+uvloop] PASSED                                                                                 [  3%]
tests/test_compat.py::TestDeprecations::test_taskgroup_cancel[trio] PASSED                                                                                           [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_acquire_nowait[asyncio] PASSED                                                                          [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_acquire_nowait[asyncio+uvloop] PASSED                                                                   [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_acquire_nowait[trio] PASSED                                                                             [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_acquire_on_behalf_of_nowait[asyncio] PASSED                                                             [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_acquire_on_behalf_of_nowait[asyncio+uvloop] PASSED                                                      [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_acquire_on_behalf_of_nowait[trio] PASSED                                                                [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_set_total_tokens[asyncio] PASSED                                                                        [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_set_total_tokens[asyncio+uvloop] PASSED                                                                 [  3%]
tests/test_compat.py::TestDeprecations::test_capacitylimiter_set_total_tokens[trio] PASSED                                                                           [  3%]
tests/test_compat.py::TestDeprecations::test_condition_release[asyncio] PASSED                                                                                       [  3%]
tests/test_compat.py::TestDeprecations::test_condition_release[asyncio+uvloop] PASSED                                                                                [  4%]
tests/test_compat.py::TestDeprecations::test_condition_release[trio] PASSED                                                                                          [  4%]
tests/test_compat.py::TestDeprecations::test_event_set[asyncio] PASSED                                                                                               [  4%]
tests/test_compat.py::TestDeprecations::test_event_set[asyncio+uvloop] PASSED                                                                                        [  4%]
tests/test_compat.py::TestDeprecations::test_event_set[trio] PASSED                                                                                                  [  4%]
tests/test_compat.py::TestDeprecations::test_lock_release[asyncio] PASSED                                                                                            [  4%]
tests/test_compat.py::TestDeprecations::test_lock_release[asyncio+uvloop] PASSED                                                                                     [  4%]
tests/test_compat.py::TestDeprecations::test_lock_release[trio] PASSED                                                                                               [  4%]
tests/test_compat.py::TestDeprecations::test_memory_object_stream_send_nowait[asyncio] PASSED                                                                        [  4%]
tests/test_compat.py::TestDeprecations::test_memory_object_stream_send_nowait[asyncio+uvloop] PASSED                                                                 [  4%]
tests/test_compat.py::TestDeprecations::test_memory_object_stream_send_nowait[trio] PASSED                                                                           [  4%]
tests/test_compat.py::TestDeprecations::test_semaphore_release[asyncio] PASSED                                                                                       [  4%]
tests/test_compat.py::TestDeprecations::test_semaphore_release[asyncio+uvloop] PASSED                                                                                [  5%]
tests/test_compat.py::TestDeprecations::test_semaphore_release[trio] PASSED                                                                                          [  5%]
tests/test_compat.py::TestDeprecations::test_move_on_after[asyncio] PASSED                                                                                           [  5%]
tests/test_compat.py::TestDeprecations::test_move_on_after[asyncio+uvloop] PASSED                                                                                    [  5%]
tests/test_compat.py::TestDeprecations::test_move_on_after[trio] PASSED                                                                                              [  5%]
tests/test_compat.py::TestDeprecations::test_fail_after[asyncio] PASSED                                                                                              [  5%]
tests/test_compat.py::TestDeprecations::test_fail_after[asyncio+uvloop] PASSED                                                                                       [  5%]
tests/test_compat.py::TestDeprecations::test_fail_after[trio] PASSED                                                                                                 [  5%]
tests/test_compat.py::TestDeprecations::test_run_sync_in_worker_thread[asyncio] PASSED                                                                               [  5%]
tests/test_compat.py::TestDeprecations::test_run_sync_in_worker_thread[asyncio+uvloop] PASSED                                                                        [  5%]
tests/test_compat.py::TestDeprecations::test_run_sync_in_worker_thread[trio] PASSED                                                                                  [  5%]
tests/test_compat.py::TestDeprecations::test_run_async_from_thread[asyncio] PASSED                                                                                   [  5%]
tests/test_compat.py::TestDeprecations::test_run_async_from_thread[asyncio+uvloop] PASSED                                                                            [  5%]
tests/test_compat.py::TestDeprecations::test_run_async_from_thread[trio] PASSED                                                                                      [  6%]
tests/test_compat.py::TestDeprecations::test_run_sync_from_thread[asyncio] PASSED                                                                                    [  6%]
tests/test_compat.py::TestDeprecations::test_run_sync_from_thread[asyncio+uvloop] PASSED                                                                             [  6%]
tests/test_compat.py::TestDeprecations::test_run_sync_from_thread[trio] PASSED                                                                                       [  6%]
tests/test_compat.py::TestDeprecations::test_current_default_worker_thread_limiter[asyncio] PASSED                                                                   [  6%]
tests/test_compat.py::TestDeprecations::test_current_default_worker_thread_limiter[asyncio+uvloop] PASSED                                                            [  6%]
tests/test_compat.py::TestDeprecations::test_current_default_worker_thread_limiter[trio] PASSED                                                                      [  6%]
tests/test_compat.py::TestDeprecations::test_create_blocking_portal[asyncio] PASSED                                                                                  [  6%]
tests/test_compat.py::TestDeprecations::test_create_blocking_portal[asyncio+uvloop] PASSED                                                                           [  6%]
tests/test_compat.py::TestDeprecations::test_create_blocking_portal[trio] PASSED                                                                                     [  6%]
tests/test_compat.py::TestPickle::test_deprecated_awaitable_none PASSED                                                                                              [  6%]
tests/test_compat.py::TestPickle::test_deprecated_awaitable_float PASSED                                                                                             [  6%]
tests/test_compat.py::TestPickle::test_deprecated_awaitable_list PASSED                                                                                              [  7%]
tests/test_debugging.py::test_main_task_name[asyncio] PASSED                                                                                                         [  7%]
tests/test_debugging.py::test_main_task_name[asyncio+uvloop] PASSED                                                                                                  [  7%]
tests/test_debugging.py::test_main_task_name[trio] FAILED                                                                                                            [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio-None-tests.test_debugging.test_non_main_task_name.<locals>.non_main] PASSED                                 [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio-name-b'name'] PASSED                                                                                        [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio-name-name] PASSED                                                                                           [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio--] PASSED                                                                                                   [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio+uvloop-None-tests.test_debugging.test_non_main_task_name.<locals>.non_main] PASSED                          [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio+uvloop-name-b'name'] PASSED                                                                                 [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio+uvloop-name-name] PASSED                                                                                    [  7%]
tests/test_debugging.py::test_non_main_task_name[asyncio+uvloop--] PASSED                                                                                            [  7%]
tests/test_debugging.py::test_get_running_tasks[asyncio] PASSED                                                                                                      [  7%]
tests/test_debugging.py::test_get_running_tasks[asyncio+uvloop] PASSED                                                                                               [  8%]
tests/test_debugging.py::test_wait_generator_based_task_blocked PASSED                                                                                               [  8%]
tests/test_debugging.py::test_wait_all_tasks_blocked_asend[asyncio] PASSED                                                                                           [  8%]
tests/test_debugging.py::test_wait_all_tasks_blocked_cancelled_task[asyncio] PASSED                                                                                  [  8%]
tests/test_debugging.py::test_wait_all_tasks_blocked_cancelled_task[asyncio+uvloop] PASSED                                                                           [  8%]
tests/test_debugging.py::test_wait_all_tasks_blocked_cancelled_task[trio]

and so on ..

agronholm commented 3 years ago

Well, what are the failures then?

kloczek commented 3 years ago

I cannot see that because some trio units are still hanging :/

agronholm commented 3 years ago

This is why you need an isolated test environment (virtualenv, docker, chroot, what have you). Otherwise you'll be perpetually trying to fix breaking tests when you don't know which plugins are affecting the test run and how.
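
For example, a throwaway virtualenv built from the unpacked source tree could look roughly like this (a sketch; it assumes the project's test and trio extras pull in pytest, trio and the other test requirements, and uses the paths from the logs above):

cd /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1
python3 -m venv .venv
.venv/bin/pip install ".[test,trio]"
.venv/bin/python -m pytest -ra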

kloczek commented 3 years ago

The issue is that it may as well mean that anyio will only work in such an isolated env ..

agronholm commented 3 years ago

No, only the tests need such an environment.

agronholm commented 3 years ago

Try setting the environment variable PYTEST_DISABLE_PLUGIN_AUTOLOAD and then explicitly using -p anyio and see what happens. It might just work.
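
Concretely, something along these lines (a sketch, assuming a POSIX shell and the same pytest path as in the logs above):

PYTEST_DISABLE_PLUGIN_AUTOLOAD=1 /usr/bin/pytest -ra -p anyio

With that variable set, pytest no longer auto-loads setuptools entry-point plugins, so every plugin the run needs has to be named explicitly with -p.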

kloczek commented 3 years ago

If what you wrote were true, it would mean that the test units are there for pytest itself and not to test the module's code via pytest :)

agronholm commented 3 years ago

The problem here is pytest picking up arbitrary plugins from the environment and automatically activating them, with unknown effects. If that can be disabled then the tests are expected to pass.
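
The plugins: line in the session header above already shows which plugins were auto-activated; they can also be listed from the environment itself (a sketch):

/usr/bin/python3 -m pip list | grep -i pytest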

kloczek commented 3 years ago

+ /usr/bin/pytest -ra -p asyncio
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1, configfile: pyproject.toml, testpaths: tests
plugins: asyncio-0.15.1
collected 6 items / 19 errors

================================================================================== ERRORS ==================================================================================
__________________________________________________________________ ERROR collecting tests/test_compat.py ___________________________________________________________________
'anyio' not found in `markers` configuration option
_________________________________________________________________ ERROR collecting tests/test_debugging.py _________________________________________________________________
'anyio' not found in `markers` configuration option
_________________________________________________________________ ERROR collecting tests/test_eventloop.py _________________________________________________________________
'anyio' not found in `markers` configuration option
__________________________________________________________________ ERROR collecting tests/test_fileio.py ___________________________________________________________________
'anyio' not found in `markers` configuration option
________________________________________________________________ ERROR collecting tests/test_from_thread.py ________________________________________________________________
'anyio' not found in `markers` configuration option
_________________________________________________________________ ERROR collecting tests/test_lowlevel.py __________________________________________________________________
'anyio' not found in `markers` configuration option
__________________________________________________________________ ERROR collecting tests/test_signals.py __________________________________________________________________
'anyio' not found in `markers` configuration option
__________________________________________________________________ ERROR collecting tests/test_sockets.py __________________________________________________________________
'anyio' not found in `markers` configuration option
_______________________________________________________________ ERROR collecting tests/test_subprocesses.py ________________________________________________________________
'anyio' not found in `markers` configuration option
______________________________________________________________ ERROR collecting tests/test_synchronization.py ______________________________________________________________
'anyio' not found in `markers` configuration option
________________________________________________________________ ERROR collecting tests/test_taskgroups.py _________________________________________________________________
'anyio' not found in `markers` configuration option
________________________________________________________________ ERROR collecting tests/test_to_process.py _________________________________________________________________
'anyio' not found in `markers` configuration option
_________________________________________________________________ ERROR collecting tests/test_to_thread.py _________________________________________________________________
'anyio' not found in `markers` configuration option
_____________________________________________________________ ERROR collecting tests/streams/test_buffered.py ______________________________________________________________
'anyio' not found in `markers` configuration option
_______________________________________________________________ ERROR collecting tests/streams/test_file.py ________________________________________________________________
'anyio' not found in `markers` configuration option
______________________________________________________________ ERROR collecting tests/streams/test_memory.py _______________________________________________________________
'anyio' not found in `markers` configuration option
______________________________________________________________ ERROR collecting tests/streams/test_stapled.py ______________________________________________________________
'anyio' not found in `markers` configuration option
_______________________________________________________________ ERROR collecting tests/streams/test_text.py ________________________________________________________________
'anyio' not found in `markers` configuration option
________________________________________________________________ ERROR collecting tests/streams/test_tls.py ________________________________________________________________
'anyio' not found in `markers` configuration option
========================================================================= short test summary info ==========================================================================
ERROR tests/test_compat.py
ERROR tests/test_debugging.py
ERROR tests/test_eventloop.py
ERROR tests/test_fileio.py
ERROR tests/test_from_thread.py
ERROR tests/test_lowlevel.py
ERROR tests/test_signals.py
ERROR tests/test_sockets.py
ERROR tests/test_subprocesses.py
ERROR tests/test_synchronization.py
ERROR tests/test_taskgroups.py
ERROR tests/test_to_process.py
ERROR tests/test_to_thread.py
ERROR tests/streams/test_buffered.py
ERROR tests/streams/test_file.py
ERROR tests/streams/test_memory.py
ERROR tests/streams/test_stapled.py
ERROR tests/streams/test_text.py
ERROR tests/streams/test_tls.py
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 19 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
============================================================================ 19 errors in 0.76s ============================================================================

agronholm commented 3 years ago

You enabled the asyncio plugin; I told you to enable the anyio one.

kloczek commented 3 years ago

Sorry, corrected the plugin name:

+ /usr/bin/pytest -ra -p anyio
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1, configfile: pyproject.toml, testpaths: tests
plugins: anyio-3.3.1
collected 1245 items

tests/test_compat.py .......................................................................................                                                         [  6%]
tests/test_debugging.py .......................                                                                                                                      [  8%]
tests/test_eventloop.py EEEEEEEEE                                                                                                                                    [  9%]
tests/test_fileio.py .........................s...........s...........................................................sss........................................... [ 21%]
....................                                                                                                                                                 [ 22%]
tests/test_from_thread.py .............................................................................                                                              [ 28%]
tests/test_lowlevel.py ...........................                                                                                                                   [ 31%]
tests/test_pytest_plugin.py FFFFFF                                                                                                                                   [ 31%]
tests/test_signals.py .........                                                                                                                                      [ 32%]
tests/test_sockets.py .............................................................................................................................................. [ 43%]
.................................................................................................................................................................... [ 56%]
.......................                                                                                                                                              [ 58%]
tests/test_subprocesses.py ..................                                                                                                                        [ 60%]
tests/test_synchronization.py ...................................................................................................                                    [ 68%]
tests/test_taskgroups.py ........................................................................................................................................... [ 79%]
.....................................s                                                                                                                               [ 82%]
tests/test_to_process.py .....................                                                                                                                       [ 83%]
tests/test_to_thread.py ........................                                                                                                                     [ 85%]
tests/streams/test_buffered.py ............                                                                                                                          [ 86%]
tests/streams/test_file.py ..............................                                                                                                            [ 89%]
tests/streams/test_memory.py .................................................................                                                                       [ 94%]
tests/streams/test_stapled.py ..................                                                                                                                     [ 95%]
tests/streams/test_text.py ...............                                                                                                                           [ 97%]
tests/streams/test_tls.py ....................................                                                                                                       [100%]

================================================================================== ERRORS ==================================================================================
_______________________________________________________________ ERROR at setup of test_sleep_until[asyncio] ________________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 24
  async def test_sleep_until(fake_sleep: AsyncMock) -> None:
      deadline = fake_current_time + 500.102352
      await sleep_until(deadline)
      fake_sleep.assert_called_once_with(deadline - fake_current_time)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
____________________________________________________________ ERROR at setup of test_sleep_until[asyncio+uvloop] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 24
  async def test_sleep_until(fake_sleep: AsyncMock) -> None:
      deadline = fake_current_time + 500.102352
      await sleep_until(deadline)
      fake_sleep.assert_called_once_with(deadline - fake_current_time)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
_________________________________________________________________ ERROR at setup of test_sleep_until[trio] _________________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 24
  async def test_sleep_until(fake_sleep: AsyncMock) -> None:
      deadline = fake_current_time + 500.102352
      await sleep_until(deadline)
      fake_sleep.assert_called_once_with(deadline - fake_current_time)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
___________________________________________________________ ERROR at setup of test_sleep_until_in_past[asyncio] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 30
  async def test_sleep_until_in_past(fake_sleep: AsyncMock) -> None:
      deadline = fake_current_time - 500.102352
      await sleep_until(deadline)
      fake_sleep.assert_called_once_with(0)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
________________________________________________________ ERROR at setup of test_sleep_until_in_past[asyncio+uvloop] ________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 30
  async def test_sleep_until_in_past(fake_sleep: AsyncMock) -> None:
      deadline = fake_current_time - 500.102352
      await sleep_until(deadline)
      fake_sleep.assert_called_once_with(0)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
_____________________________________________________________ ERROR at setup of test_sleep_until_in_past[trio] _____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 30
  async def test_sleep_until_in_past(fake_sleep: AsyncMock) -> None:
      deadline = fake_current_time - 500.102352
      await sleep_until(deadline)
      fake_sleep.assert_called_once_with(0)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
______________________________________________________________ ERROR at setup of test_sleep_forever[asyncio] _______________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 36
  async def test_sleep_forever(fake_sleep: AsyncMock) -> None:
      await sleep_forever()
      fake_sleep.assert_called_once_with(math.inf)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
___________________________________________________________ ERROR at setup of test_sleep_forever[asyncio+uvloop] ___________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 36
  async def test_sleep_forever(fake_sleep: AsyncMock) -> None:
      await sleep_forever()
      fake_sleep.assert_called_once_with(math.inf)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
________________________________________________________________ ERROR at setup of test_sleep_forever[trio] ________________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 36
  async def test_sleep_forever(fake_sleep: AsyncMock) -> None:
      await sleep_forever()
      fake_sleep.assert_called_once_with(math.inf)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
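For context on the `fixture 'mocker' not found` errors above: `mocker` is supplied by the `pytest-mock` plugin, which does not appear to be active in this environment, so every test in `tests/test_eventloop.py` that depends on `fake_sleep` errors out at setup. A fixture of this shape would typically look like the sketch below; the patch target (`"anyio.sleep"`) is an assumption for illustration, since the fixture body is not shown in the output.

```python
# Sketch only: fake_sleep as it would typically be written with pytest-mock.
# The patch target is an assumption; the real fixture body is not visible above.
from unittest.mock import AsyncMock

import pytest
from pytest_mock import MockerFixture


@pytest.fixture
def fake_sleep(mocker: MockerFixture) -> AsyncMock:
    # Replace anyio.sleep with an AsyncMock so the sleep_* helpers can be
    # asserted against without actually sleeping.
    return mocker.patch("anyio.sleep", AsyncMock())
```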
================================================================================= FAILURES =================================================================================
_______________________________________________________________________________ test_plugin ________________________________________________________________________________
/usr/lib/python3.8/site-packages/_pytest/runner.py:311: in from_call
    result: Optional[TResult] = func()
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/unraisableexception.py:88: in pytest_runtest_call
    yield from unraisable_exception_runtest_hook()
/usr/lib/python3.8/site-packages/_pytest/unraisableexception.py:78: in unraisable_exception_runtest_hook
    warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E   pytest.PytestUnraisableExceptionWarning: Exception ignored in: <coroutine object async_fixture at 0x7f792cd19d40>
E
E   Traceback (most recent call last):
E     File "/usr/lib64/python3.8/warnings.py", line 506, in _warn_unawaited_coroutine
E       warn(msg, category=RuntimeWarning, stacklevel=2, source=coro)
E   RuntimeWarning: coroutine 'async_fixture' was never awaited
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-150/test_plugin0
collecting ... collected 4 items

test_plugin.py::test_marked_test FAILED                                  [ 25%]
test_plugin.py::test_async_fixture_from_marked_test FAILED               [ 50%]
test_plugin.py::test_async_fixture_from_sync_test ERROR                  [ 75%]
test_plugin.py::test_skip_inline FAILED                                  [100%]

==================================== ERRORS ====================================
_____________ ERROR at setup of test_async_fixture_from_sync_test ______________
file /tmp/pytest-of-tkloczko/pytest-150/test_plugin0/test_plugin.py, line 20
  def test_async_fixture_from_sync_test(anyio_backend_name, async_fixture):
E       fixture 'anyio_backend_name' not found
>       available fixtures: async_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, some_feature, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/tmp/pytest-of-tkloczko/pytest-150/test_plugin0/test_plugin.py:20
=================================== FAILURES ===================================
_______________________________ test_marked_test _______________________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f7ad01630d0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_plugin.py::test_marked_test'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
_____________________ test_async_fixture_from_marked_test ______________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f7ad0163280>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_plugin.py::test_async_fixture_from_marked_test'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
_______________________________ test_skip_inline _______________________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f792cd1c700>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_plugin.py::test_skip_inline'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
=========================== short test summary info ============================
FAILED test_plugin.py::test_marked_test - pytest.PytestUnhandledCoroutineWarn...
FAILED test_plugin.py::test_async_fixture_from_marked_test - pytest.PytestUnh...
FAILED test_plugin.py::test_skip_inline - pytest.PytestUnhandledCoroutineWarn...
ERROR test_plugin.py::test_async_fixture_from_sync_test
========================== 3 failed, 1 error in 1.32s ==========================
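The inner pytester run above fails with `PytestUnhandledCoroutineWarning` because nothing is executing the coroutines: the anyio pytest plugin is not active inside that subprocess, so the `anyio`-marked tests are skipped and `anyio_backend_name` cannot be resolved. For reference, this is the pattern those generated test files rely on (test name and body here are illustrative, not the actual generated file):

```python
# Minimal sketch of an anyio-marked test, assuming the anyio pytest plugin is
# active. The plugin runs the coroutine on the selected backend and provides
# the anyio_backend / anyio_backend_name fixtures; without it, pytest emits the
# "async def functions are not natively supported" warning shown above.
import anyio
import pytest


@pytest.fixture
def anyio_backend():
    # Pin the example to the plain asyncio backend.
    return "asyncio"


@pytest.mark.anyio
async def test_sleep_zero(anyio_backend_name):
    assert anyio_backend_name == "asyncio"
    await anyio.sleep(0)
```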
_______________________________________________________________________________ test_asyncio _______________________________________________________________________________
/usr/lib/python3.8/site-packages/_pytest/runner.py:311: in from_call
    result: Optional[TResult] = func()
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/unraisableexception.py:88: in pytest_runtest_call
    yield from unraisable_exception_runtest_hook()
/usr/lib/python3.8/site-packages/_pytest/unraisableexception.py:78: in unraisable_exception_runtest_hook
    warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E   pytest.PytestUnraisableExceptionWarning: Exception ignored in: <coroutine object TestClassFixtures.async_class_fixture at 0x7f792cb84040>
E
E   Traceback (most recent call last):
E     File "/usr/lib64/python3.8/warnings.py", line 506, in _warn_unawaited_coroutine
E       warn(msg, category=RuntimeWarning, stacklevel=2, source=coro)
E   RuntimeWarning: coroutine 'TestClassFixtures.async_class_fixture' was never awaited
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-150/test_asyncio0
collecting ... collected 4 items

test_asyncio.py::TestClassFixtures::test_class_fixture_in_test_method ERROR [ 25%]
test_asyncio.py::test_callback_exception_during_test FAILED              [ 50%]
test_asyncio.py::test_callback_exception_during_setup FAILED             [ 75%]
test_asyncio.py::test_callback_exception_during_teardown FAILED          [100%]

==================================== ERRORS ====================================
____ ERROR at setup of TestClassFixtures.test_class_fixture_in_test_method _____
file /tmp/pytest-of-tkloczko/pytest-150/test_asyncio0/test_asyncio.py, line 14
      def test_class_fixture_in_test_method(self, async_class_fixture, anyio_backend_name):
E       fixture 'anyio_backend_name' not found
>       available fixtures: anyio_backend, async_class_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, setup_fail_fixture, teardown_fail_fixture, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/tmp/pytest-of-tkloczko/pytest-150/test_asyncio0/test_asyncio.py:14
=================================== FAILURES ===================================
_____________________ test_callback_exception_during_test ______________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f792cbbcee0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_asyncio.py::test_callback_exception_during_test'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
_____________________ test_callback_exception_during_setup _____________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f792cdca5e0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_asyncio.py::test_callback_exception_during_setup'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
___________________ test_callback_exception_during_teardown ____________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f7ad002ca60>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_asyncio.py::test_callback_exception_during_teardown'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
=========================== short test summary info ============================
FAILED test_asyncio.py::test_callback_exception_during_test - pytest.PytestUn...
FAILED test_asyncio.py::test_callback_exception_during_setup - pytest.PytestU...
FAILED test_asyncio.py::test_callback_exception_during_teardown - pytest.Pyte...
ERROR test_asyncio.py::TestClassFixtures::test_class_fixture_in_test_method
========================== 3 failed, 1 error in 0.37s ==========================
________________________________________________________________________ test_autouse_async_fixture ________________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_pytest_plugin.py:175: in test_autouse_async_fixture
    result.assert_outcomes(passed=len(get_all_backends()))
E   AssertionError: assert {'errors': 1,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E     Omitting 4 identical items, use -vv to show
E     Differing items:
E     {'passed': 0} != {'passed': 2}
E     {'errors': 1} != {'errors': 0}
E     Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-150/test_autouse_async_fixture0
collecting ... collected 1 item

test_autouse_async_fixture.py::test_autouse_backend ERROR                [100%]

==================================== ERRORS ====================================
____________________ ERROR at setup of test_autouse_backend ____________________
file /tmp/pytest-of-tkloczko/pytest-150/test_autouse_async_fixture0/test_autouse_async_fixture.py, line 7
  def test_autouse_backend(autouse_backend_name):
file /tmp/pytest-of-tkloczko/pytest-150/test_autouse_async_fixture0/conftest.py, line 6
  @pytest.fixture(autouse=True)
  async def autouse_async_fixture(anyio_backend_name):
      global autouse_backend
      autouse_backend = anyio_backend_name
E       fixture 'anyio_backend_name' not found
>       available fixtures: autouse_async_fixture, autouse_backend_name, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/tmp/pytest-of-tkloczko/pytest-150/test_autouse_async_fixture0/conftest.py:6
=========================== short test summary info ============================
ERROR test_autouse_async_fixture.py::test_autouse_backend
=============================== 1 error in 0.01s ===============================
__________________________________________________________________ test_cancel_scope_in_asyncgen_fixture ___________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_pytest_plugin.py:202: in test_cancel_scope_in_asyncgen_fixture
    result.assert_outcomes(passed=len(get_all_backends()))
E   AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E     Omitting 4 identical items, use -vv to show
E     Differing items:
E     {'passed': 0} != {'passed': 2}
E     {'failed': 1} != {'failed': 0}
E     Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-150/test_cancel_scope_in_asyncgen_fixture0
collecting ... collected 1 item

test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture FAILED [100%]

=================================== FAILURES ===================================
_______________________ test_cancel_in_asyncgen_fixture ________________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f792c8c7f70>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
=========================== short test summary info ============================
FAILED test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture
============================== 1 failed in 0.15s ===============================
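The generated file driven by `test_cancel_scope_in_asyncgen_fixture` is not shown above, but, as the name suggests, it exercises an async generator fixture that yields from inside a cancel scope, which the anyio plugin sets up and tears down in the same task. A rough sketch of that pattern, with assumed names and contents:

```python
# Rough sketch (assumed contents; the generated test file is not shown above):
# an async generator fixture yielding inside a cancel scope. The anyio plugin
# runs setup and teardown in the same task, so the scope exits cleanly.
import anyio
import pytest


@pytest.fixture
def anyio_backend():
    return "asyncio"


@pytest.fixture
async def scoped_value():
    with anyio.CancelScope():
        yield "value"


@pytest.mark.anyio
async def test_cancel_in_asyncgen_fixture(scoped_value):
    assert scoped_value == "value"
```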
_______________________________________________________________________ test_hypothesis_module_mark ________________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_pytest_plugin.py:233: in test_hypothesis_module_mark
    result.assert_outcomes(passed=len(get_all_backends()) + 1, xfailed=len(get_all_backends()))
E   AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E     Omitting 3 identical items, use -vv to show
E     Differing items:
E     {'failed': 1} != {'failed': 0}
E     {'passed': 1} != {'passed': 3}
E     {'xfailed': 1} != {'xfailed': 2}
E     Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-150/test_hypothesis_module_mark0
collecting ... collected 3 items

test_hypothesis_module_mark.py::test_hypothesis_wrapper FAILED           [ 33%]
test_hypothesis_module_mark.py::test_hypothesis_wrapper_regular PASSED   [ 66%]
test_hypothesis_module_mark.py::test_hypothesis_wrapper_failing XFAIL    [100%]

=================================== FAILURES ===================================
___________________________ test_hypothesis_wrapper ____________________________

    @given(x=just(1))
>   async def test_hypothesis_wrapper(x):

test_hypothesis_module_mark.py:9:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <hypothesis.core.StateForActualGivenExecution object at 0x7f792c9af2e0>
data = ConjectureData(VALID, 0 bytes, frozen)

    def _execute_once_for_engine(self, data):
        """Wrapper around ``execute_once`` that intercepts test failure
        exceptions and single-test control exceptions, and turns them into
        appropriate method calls to `data` instead.

        This allows the engine to assume that any exception other than
        ``StopTest`` must be a fatal error, and should stop the entire engine.
        """
        try:
            trace = frozenset()
            if (
                self.failed_normally
                and not self.failed_due_to_deadline
                and Phase.shrink in self.settings.phases
                and Phase.explain in self.settings.phases
                and sys.gettrace() is None
                and not PYPY
            ):  # pragma: no cover
                # This is in fact covered by our *non-coverage* tests, but due to the
                # settrace() contention *not* by our coverage tests.  Ah well.
                tracer = Tracer()
                try:
                    sys.settrace(tracer.trace)
                    result = self.execute_once(data)
                    if data.status == Status.VALID:
                        self.explain_traces[None].add(frozenset(tracer.branches))
                finally:
                    sys.settrace(None)
                    trace = frozenset(tracer.branches)
            else:
                result = self.execute_once(data)
            if result is not None:
>               fail_health_check(
                    self.settings,
                    "Tests run under @given should return None, but "
                    f"{self.test.__name__} returned {result!r} instead.",
                    HealthCheck.return_value,
                )
E               hypothesis.errors.FailedHealthCheck: Tests run under @given should return None, but test_hypothesis_wrapper returned <coroutine object test_hypothesis_wrapper at 0x7f792cb5db40> instead.
E               See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.return_value to the suppress_health_check settings for this test.

/usr/lib/python3.8/site-packages/hypothesis/core.py:692: FailedHealthCheck
----------------------------- Captured stdout call -----------------------------
You can add @seed(180267515461561116554664385941329166602) to this test to reproduce this failure.
=========================== short test summary info ============================
FAILED test_hypothesis_module_mark.py::test_hypothesis_wrapper - hypothesis.e...
==================== 1 failed, 1 passed, 1 xfailed in 0.20s ====================
______________________________________________________________________ test_hypothesis_function_mark _______________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_pytest_plugin.py:272: in test_hypothesis_function_mark
    result.assert_outcomes(passed=2 * len(get_all_backends()), xfailed=2 * len(get_all_backends()))
E   AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E     Omitting 3 identical items, use -vv to show
E     Differing items:
E     {'failed': 2} != {'failed': 0}
E     {'passed': 0} != {'passed': 4}
E     {'xfailed': 2} != {'xfailed': 4}
E     Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-150/test_hypothesis_function_mark0
collecting ... collected 4 items

test_hypothesis_function_mark.py::test_anyio_mark_first FAILED           [ 25%]
test_hypothesis_function_mark.py::test_anyio_mark_last FAILED            [ 50%]
test_hypothesis_function_mark.py::test_anyio_mark_first_fail XFAIL       [ 75%]
test_hypothesis_function_mark.py::test_anyio_mark_last_fail XFAIL        [100%]

=================================== FAILURES ===================================
____________________________ test_anyio_mark_first _____________________________

    @pytest.mark.anyio
>   @given(x=just(1))

test_hypothesis_function_mark.py:7:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <hypothesis.core.StateForActualGivenExecution object at 0x7f792c64eeb0>
data = ConjectureData(VALID, 0 bytes, frozen)

    def _execute_once_for_engine(self, data):
        """Wrapper around ``execute_once`` that intercepts test failure
        exceptions and single-test control exceptions, and turns them into
        appropriate method calls to `data` instead.

        This allows the engine to assume that any exception other than
        ``StopTest`` must be a fatal error, and should stop the entire engine.
        """
        try:
            trace = frozenset()
            if (
                self.failed_normally
                and not self.failed_due_to_deadline
                and Phase.shrink in self.settings.phases
                and Phase.explain in self.settings.phases
                and sys.gettrace() is None
                and not PYPY
            ):  # pragma: no cover
                # This is in fact covered by our *non-coverage* tests, but due to the
                # settrace() contention *not* by our coverage tests.  Ah well.
                tracer = Tracer()
                try:
                    sys.settrace(tracer.trace)
                    result = self.execute_once(data)
                    if data.status == Status.VALID:
                        self.explain_traces[None].add(frozenset(tracer.branches))
                finally:
                    sys.settrace(None)
                    trace = frozenset(tracer.branches)
            else:
                result = self.execute_once(data)
            if result is not None:
>               fail_health_check(
                    self.settings,
                    "Tests run under @given should return None, but "
                    f"{self.test.__name__} returned {result!r} instead.",
                    HealthCheck.return_value,
                )
E               hypothesis.errors.FailedHealthCheck: Tests run under @given should return None, but test_anyio_mark_first returned <coroutine object test_anyio_mark_first at 0x7f792c6fab40> instead.
E               See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.return_value to the suppress_health_check settings for this test.

/usr/lib/python3.8/site-packages/hypothesis/core.py:692: FailedHealthCheck
----------------------------- Captured stdout call -----------------------------
You can add @seed(318154536163484788587391827459419431167) to this test to reproduce this failure.
_____________________________ test_anyio_mark_last _____________________________

>   ???

test_hypothesis_function_mark.py:13:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <hypothesis.core.StateForActualGivenExecution object at 0x7f792c691760>
data = ConjectureData(VALID, 0 bytes, frozen)

    def _execute_once_for_engine(self, data):
        """Wrapper around ``execute_once`` that intercepts test failure
        exceptions and single-test control exceptions, and turns them into
        appropriate method calls to `data` instead.

        This allows the engine to assume that any exception other than
        ``StopTest`` must be a fatal error, and should stop the entire engine.
        """
        try:
            trace = frozenset()
            if (
                self.failed_normally
                and not self.failed_due_to_deadline
                and Phase.shrink in self.settings.phases
                and Phase.explain in self.settings.phases
                and sys.gettrace() is None
                and not PYPY
            ):  # pragma: no cover
                # This is in fact covered by our *non-coverage* tests, but due to the
                # settrace() contention *not* by our coverage tests.  Ah well.
                tracer = Tracer()
                try:
                    sys.settrace(tracer.trace)
                    result = self.execute_once(data)
                    if data.status == Status.VALID:
                        self.explain_traces[None].add(frozenset(tracer.branches))
                finally:
                    sys.settrace(None)
                    trace = frozenset(tracer.branches)
            else:
                result = self.execute_once(data)
            if result is not None:
>               fail_health_check(
                    self.settings,
                    "Tests run under @given should return None, but "
                    f"{self.test.__name__} returned {result!r} instead.",
                    HealthCheck.return_value,
                )
E               hypothesis.errors.FailedHealthCheck: Tests run under @given should return None, but test_anyio_mark_last returned <coroutine object test_anyio_mark_last at 0x7f792c7a6bc0> instead.
E               See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.return_value to the suppress_health_check settings for this test.

/usr/lib/python3.8/site-packages/hypothesis/core.py:692: FailedHealthCheck
----------------------------- Captured stdout call -----------------------------
You can add @seed(38977849223032161486997914589834192227) to this test to reproduce this failure.
=========================== short test summary info ============================
FAILED test_hypothesis_function_mark.py::test_anyio_mark_first - hypothesis.e...
FAILED test_hypothesis_function_mark.py::test_anyio_mark_last - hypothesis.er...
========================= 2 failed, 2 xfailed in 0.28s =========================
========================================================================= short test summary info ==========================================================================
SKIPPED [1] tests/test_fileio.py:119: Drive only makes sense on Windows
SKIPPED [1] tests/test_fileio.py:159: Only makes sense on Windows
SKIPPED [3] tests/test_fileio.py:318: os.lchmod() is not available
SKIPPED [1] tests/test_taskgroups.py:967: Cancel messages are only supported on py3.9+
ERROR tests/test_eventloop.py::test_sleep_until[asyncio]
ERROR tests/test_eventloop.py::test_sleep_until[asyncio+uvloop]
ERROR tests/test_eventloop.py::test_sleep_until[trio]
ERROR tests/test_eventloop.py::test_sleep_until_in_past[asyncio]
ERROR tests/test_eventloop.py::test_sleep_until_in_past[asyncio+uvloop]
ERROR tests/test_eventloop.py::test_sleep_until_in_past[trio]
ERROR tests/test_eventloop.py::test_sleep_forever[asyncio]
ERROR tests/test_eventloop.py::test_sleep_forever[asyncio+uvloop]
ERROR tests/test_eventloop.py::test_sleep_forever[trio]
FAILED tests/test_pytest_plugin.py::test_plugin - pytest.PytestUnraisableExceptionWarning: Exception ignored in: <coroutine object async_fixture at 0x7f792cd19d40>
FAILED tests/test_pytest_plugin.py::test_asyncio - pytest.PytestUnraisableExceptionWarning: Exception ignored in: <coroutine object TestClassFixtures.async_class_fixture...
FAILED tests/test_pytest_plugin.py::test_autouse_async_fixture - AssertionError: assert {'errors': 1,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_pytest_plugin.py::test_cancel_scope_in_asyncgen_fixture - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_pytest_plugin.py::test_hypothesis_module_mark - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_pytest_plugin.py::test_hypothesis_function_mark - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
=========================================================== 6 failed, 1224 passed, 6 skipped, 9 errors in 40.50s ===========================================================
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-221c7167-e68c-4227-8638-5e758b04f0a7/test_rmtree_errorhandler_reado0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_rmtree_errorhandler_reado0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-221c7167-e68c-4227-8638-5e758b04f0a7/test_rmtree_errorhandler_rerai0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_rmtree_errorhandler_rerai0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-221c7167-e68c-4227-8638-5e758b04f0a7/test_safe_set_no_perms0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_safe_set_no_perms0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-221c7167-e68c-4227-8638-5e758b04f0a7/test_safe_delete_no_perms0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_safe_delete_no_perms0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-221c7167-e68c-4227-8638-5e758b04f0a7/test_safe_get_no_perms0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_safe_get_no_perms0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-221c7167-e68c-4227-8638-5e758b04f0a7
<class 'OSError'>: [Errno 39] Directory not empty: '/tmp/pytest-of-tkloczko/garbage-221c7167-e68c-4227-8638-5e758b04f0a7'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-76963890-50ad-4a9d-9925-4d9c381d39d4/test_rmtree_errorhandler_reado0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_rmtree_errorhandler_reado0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-76963890-50ad-4a9d-9925-4d9c381d39d4/test_rmtree_errorhandler_rerai0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_rmtree_errorhandler_rerai0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-76963890-50ad-4a9d-9925-4d9c381d39d4/test_safe_get_no_perms0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_safe_get_no_perms0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-76963890-50ad-4a9d-9925-4d9c381d39d4/test_safe_set_no_perms0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_safe_set_no_perms0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-76963890-50ad-4a9d-9925-4d9c381d39d4/test_safe_delete_no_perms0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_safe_delete_no_perms0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-76963890-50ad-4a9d-9925-4d9c381d39d4
<class 'OSError'>: [Errno 39] Directory not empty: '/tmp/pytest-of-tkloczko/garbage-76963890-50ad-4a9d-9925-4d9c381d39d4'
  warnings.warn(
agronholm commented 3 years ago

Ok, this is progress. To get rid of the remaining test failures, the pytest-mock and Hypothesis pytest plugins should be enabled as well, since the test suite relies on the mocker fixture and on Hypothesis integration.
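
For illustration only (not the maintainer's exact instructions), a minimal invocation along those lines is sketched below. It assumes pytest-mock and hypothesis are installed in the build environment and that only the plugins already identified as interfering are blacklisted, so everything else auto-loads:

# sketch, assuming pytest-mock (provides the mocker fixture) and hypothesis are installed;
# disable only the plugins known to interfere and let the rest auto-load
pip install pytest-mock hypothesis
python -m pytest -ra -p no:randomly -p no:asyncio -v tests/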

kloczek commented 3 years ago

This is with hypothesis 6.14.6.

kloczek commented 3 years ago
+ /usr/bin/pytest -ra -p anyio -p hypothesis -p mock
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1, configfile: pyproject.toml, testpaths: tests
plugins: anyio-3.3.1
collected 1245 items

tests/test_compat.py .......................................................................................                                                         [  6%]
tests/test_debugging.py .......................                                                                                                                      [  8%]
tests/test_eventloop.py EEEEEEEEE                                                                                                                                    [  9%]
tests/test_fileio.py .........................s...........s...........................................................sss........................................... [ 21%]
....................                                                                                                                                                 [ 22%]
tests/test_from_thread.py .............................................................................                                                              [ 28%]
tests/test_lowlevel.py ...........................                                                                                                                   [ 31%]
tests/test_pytest_plugin.py FFFFFF                                                                                                                                   [ 31%]
tests/test_signals.py .........                                                                                                                                      [ 32%]
tests/test_sockets.py .............................................................................................................................................. [ 43%]
.................................................................................................................................................................... [ 56%]
.......................                                                                                                                                              [ 58%]
tests/test_subprocesses.py ..................                                                                                                                        [ 60%]
tests/test_synchronization.py ...................................................................................................                                    [ 68%]
tests/test_taskgroups.py ........................................................................................................................................... [ 79%]
.....................................s                                                                                                                               [ 82%]
tests/test_to_process.py .....................                                                                                                                       [ 83%]
tests/test_to_thread.py ........................                                                                                                                     [ 85%]
tests/streams/test_buffered.py ............                                                                                                                          [ 86%]
tests/streams/test_file.py ..............................                                                                                                            [ 89%]
tests/streams/test_memory.py .................................................................                                                                       [ 94%]
tests/streams/test_stapled.py ..................                                                                                                                     [ 95%]
tests/streams/test_text.py ...............                                                                                                                           [ 97%]
tests/streams/test_tls.py ....................................                                                                                                       [100%]

================================================================================== ERRORS ==================================================================================
_______________________________________________________________ ERROR at setup of test_sleep_until[asyncio] ________________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 24
  async def test_sleep_until(fake_sleep: AsyncMock) -> None:
      deadline = fake_current_time + 500.102352
      await sleep_until(deadline)
      fake_sleep.assert_called_once_with(deadline - fake_current_time)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
____________________________________________________________ ERROR at setup of test_sleep_until[asyncio+uvloop] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 24
  async def test_sleep_until(fake_sleep: AsyncMock) -> None:
      deadline = fake_current_time + 500.102352
      await sleep_until(deadline)
      fake_sleep.assert_called_once_with(deadline - fake_current_time)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
_________________________________________________________________ ERROR at setup of test_sleep_until[trio] _________________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 24
  async def test_sleep_until(fake_sleep: AsyncMock) -> None:
      deadline = fake_current_time + 500.102352
      await sleep_until(deadline)
      fake_sleep.assert_called_once_with(deadline - fake_current_time)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
___________________________________________________________ ERROR at setup of test_sleep_until_in_past[asyncio] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 30
  async def test_sleep_until_in_past(fake_sleep: AsyncMock) -> None:
      deadline = fake_current_time - 500.102352
      await sleep_until(deadline)
      fake_sleep.assert_called_once_with(0)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
________________________________________________________ ERROR at setup of test_sleep_until_in_past[asyncio+uvloop] ________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 30
  async def test_sleep_until_in_past(fake_sleep: AsyncMock) -> None:
      deadline = fake_current_time - 500.102352
      await sleep_until(deadline)
      fake_sleep.assert_called_once_with(0)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
_____________________________________________________________ ERROR at setup of test_sleep_until_in_past[trio] _____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 30
  async def test_sleep_until_in_past(fake_sleep: AsyncMock) -> None:
      deadline = fake_current_time - 500.102352
      await sleep_until(deadline)
      fake_sleep.assert_called_once_with(0)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
______________________________________________________________ ERROR at setup of test_sleep_forever[asyncio] _______________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 36
  async def test_sleep_forever(fake_sleep: AsyncMock) -> None:
      await sleep_forever()
      fake_sleep.assert_called_once_with(math.inf)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
___________________________________________________________ ERROR at setup of test_sleep_forever[asyncio+uvloop] ___________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 36
  async def test_sleep_forever(fake_sleep: AsyncMock) -> None:
      await sleep_forever()
      fake_sleep.assert_called_once_with(math.inf)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
________________________________________________________________ ERROR at setup of test_sleep_forever[trio] ________________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 36
  async def test_sleep_forever(fake_sleep: AsyncMock) -> None:
      await sleep_forever()
      fake_sleep.assert_called_once_with(math.inf)
file /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py, line 18
  @pytest.fixture
  def fake_sleep(mocker: MockerFixture) -> AsyncMock:
E       fixture 'mocker' not found
>       available fixtures: LineMatcher, _config_for_test, _pytest, _sys_snapshot, anyio_backend, anyio_backend_name, anyio_backend_options, asyncio_event_loop, ca, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, client_context, doctest_namespace, fake_sleep, linecomp, monkeypatch, pytestconfig, pytester, record_property, record_testsuite_property, record_xml_attribute, recwarn, server_context, testdir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_eventloop.py:18
================================================================================= FAILURES =================================================================================
_______________________________________________________________________________ test_plugin ________________________________________________________________________________
/usr/lib/python3.8/site-packages/_pytest/runner.py:311: in from_call
    result: Optional[TResult] = func()
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/unraisableexception.py:88: in pytest_runtest_call
    yield from unraisable_exception_runtest_hook()
/usr/lib/python3.8/site-packages/_pytest/unraisableexception.py:78: in unraisable_exception_runtest_hook
    warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E   pytest.PytestUnraisableExceptionWarning: Exception ignored in: <coroutine object async_fixture at 0x7efaf0dd7240>
E
E   Traceback (most recent call last):
E     File "/usr/lib64/python3.8/warnings.py", line 506, in _warn_unawaited_coroutine
E       warn(msg, category=RuntimeWarning, stacklevel=2, source=coro)
E   RuntimeWarning: coroutine 'async_fixture' was never awaited
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-152/test_plugin0
collecting ... collected 4 items

test_plugin.py::test_marked_test FAILED                                  [ 25%]
test_plugin.py::test_async_fixture_from_marked_test FAILED               [ 50%]
test_plugin.py::test_async_fixture_from_sync_test ERROR                  [ 75%]
test_plugin.py::test_skip_inline FAILED                                  [100%]

==================================== ERRORS ====================================
_____________ ERROR at setup of test_async_fixture_from_sync_test ______________
file /tmp/pytest-of-tkloczko/pytest-152/test_plugin0/test_plugin.py, line 20
  def test_async_fixture_from_sync_test(anyio_backend_name, async_fixture):
E       fixture 'anyio_backend_name' not found
>       available fixtures: async_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, some_feature, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/tmp/pytest-of-tkloczko/pytest-152/test_plugin0/test_plugin.py:20
=================================== FAILURES ===================================
_______________________________ test_marked_test _______________________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7efaf0eced30>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_plugin.py::test_marked_test'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
_____________________ test_async_fixture_from_marked_test ______________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7efaf16ee940>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_plugin.py::test_async_fixture_from_marked_test'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
_______________________________ test_skip_inline _______________________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7efaf0e53e50>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_plugin.py::test_skip_inline'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
=========================== short test summary info ============================
FAILED test_plugin.py::test_marked_test - pytest.PytestUnhandledCoroutineWarn...
FAILED test_plugin.py::test_async_fixture_from_marked_test - pytest.PytestUnh...
FAILED test_plugin.py::test_skip_inline - pytest.PytestUnhandledCoroutineWarn...
ERROR test_plugin.py::test_async_fixture_from_sync_test
========================== 3 failed, 1 error in 0.58s ==========================
_______________________________________________________________________________ test_asyncio _______________________________________________________________________________
/usr/lib/python3.8/site-packages/_pytest/runner.py:311: in from_call
    result: Optional[TResult] = func()
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/unraisableexception.py:88: in pytest_runtest_call
    yield from unraisable_exception_runtest_hook()
/usr/lib/python3.8/site-packages/_pytest/unraisableexception.py:78: in unraisable_exception_runtest_hook
    warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E   pytest.PytestUnraisableExceptionWarning: Exception ignored in: <coroutine object TestClassFixtures.async_class_fixture at 0x7efaf0c99040>
E
E   Traceback (most recent call last):
E     File "/usr/lib64/python3.8/warnings.py", line 506, in _warn_unawaited_coroutine
E       warn(msg, category=RuntimeWarning, stacklevel=2, source=coro)
E   RuntimeWarning: coroutine 'TestClassFixtures.async_class_fixture' was never awaited
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-152/test_asyncio0
collecting ... collected 4 items

test_asyncio.py::TestClassFixtures::test_class_fixture_in_test_method ERROR [ 25%]
test_asyncio.py::test_callback_exception_during_test FAILED              [ 50%]
test_asyncio.py::test_callback_exception_during_setup FAILED             [ 75%]
test_asyncio.py::test_callback_exception_during_teardown FAILED          [100%]

==================================== ERRORS ====================================
____ ERROR at setup of TestClassFixtures.test_class_fixture_in_test_method _____
file /tmp/pytest-of-tkloczko/pytest-152/test_asyncio0/test_asyncio.py, line 14
      def test_class_fixture_in_test_method(self, async_class_fixture, anyio_backend_name):
E       fixture 'anyio_backend_name' not found
>       available fixtures: anyio_backend, async_class_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, setup_fail_fixture, teardown_fail_fixture, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/tmp/pytest-of-tkloczko/pytest-152/test_asyncio0/test_asyncio.py:14
=================================== FAILURES ===================================
_____________________ test_callback_exception_during_test ______________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7efaf0cb5ca0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_asyncio.py::test_callback_exception_during_test'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
_____________________ test_callback_exception_during_setup _____________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7efaf0b815e0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_asyncio.py::test_callback_exception_during_setup'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
___________________ test_callback_exception_during_teardown ____________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7efaf0d36700>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_asyncio.py::test_callback_exception_during_teardown'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
=========================== short test summary info ============================
FAILED test_asyncio.py::test_callback_exception_during_test - pytest.PytestUn...
FAILED test_asyncio.py::test_callback_exception_during_setup - pytest.PytestU...
FAILED test_asyncio.py::test_callback_exception_during_teardown - pytest.Pyte...
ERROR test_asyncio.py::TestClassFixtures::test_class_fixture_in_test_method
========================== 3 failed, 1 error in 0.49s ==========================
________________________________________________________________________ test_autouse_async_fixture ________________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_pytest_plugin.py:175: in test_autouse_async_fixture
    result.assert_outcomes(passed=len(get_all_backends()))
E   AssertionError: assert {'errors': 1,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E     Omitting 4 identical items, use -vv to show
E     Differing items:
E     {'passed': 0} != {'passed': 2}
E     {'errors': 1} != {'errors': 0}
E     Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-152/test_autouse_async_fixture0
collecting ... collected 1 item

test_autouse_async_fixture.py::test_autouse_backend ERROR                [100%]

==================================== ERRORS ====================================
____________________ ERROR at setup of test_autouse_backend ____________________
file /tmp/pytest-of-tkloczko/pytest-152/test_autouse_async_fixture0/test_autouse_async_fixture.py, line 7
  def test_autouse_backend(autouse_backend_name):
file /tmp/pytest-of-tkloczko/pytest-152/test_autouse_async_fixture0/conftest.py, line 6
  @pytest.fixture(autouse=True)
  async def autouse_async_fixture(anyio_backend_name):
      global autouse_backend
      autouse_backend = anyio_backend_name
E       fixture 'anyio_backend_name' not found
>       available fixtures: autouse_async_fixture, autouse_backend_name, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/tmp/pytest-of-tkloczko/pytest-152/test_autouse_async_fixture0/conftest.py:6
=========================== short test summary info ============================
ERROR test_autouse_async_fixture.py::test_autouse_backend
=============================== 1 error in 0.01s ===============================
__________________________________________________________________ test_cancel_scope_in_asyncgen_fixture ___________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_pytest_plugin.py:202: in test_cancel_scope_in_asyncgen_fixture
    result.assert_outcomes(passed=len(get_all_backends()))
E   AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E     Omitting 4 identical items, use -vv to show
E     Differing items:
E     {'failed': 1} != {'failed': 0}
E     {'passed': 0} != {'passed': 2}
E     Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-152/test_cancel_scope_in_asyncgen_fixture0
collecting ... collected 1 item

test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture FAILED [100%]

=================================== FAILURES ===================================
_______________________ test_cancel_in_asyncgen_fixture ________________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7efaf0dcf9d0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
=========================== short test summary info ============================
FAILED test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture
============================== 1 failed in 0.12s ===============================
_______________________________________________________________________ test_hypothesis_module_mark ________________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_pytest_plugin.py:233: in test_hypothesis_module_mark
    result.assert_outcomes(passed=len(get_all_backends()) + 1, xfailed=len(get_all_backends()))
E   AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E     Omitting 3 identical items, use -vv to show
E     Differing items:
E     {'failed': 1} != {'failed': 0}
E     {'xfailed': 1} != {'xfailed': 2}
E     {'passed': 1} != {'passed': 3}
E     Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-152/test_hypothesis_module_mark0
collecting ... collected 3 items

test_hypothesis_module_mark.py::test_hypothesis_wrapper FAILED           [ 33%]
test_hypothesis_module_mark.py::test_hypothesis_wrapper_regular PASSED   [ 66%]
test_hypothesis_module_mark.py::test_hypothesis_wrapper_failing XFAIL    [100%]

=================================== FAILURES ===================================
___________________________ test_hypothesis_wrapper ____________________________

    @given(x=just(1))
>   async def test_hypothesis_wrapper(x):

test_hypothesis_module_mark.py:9:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <hypothesis.core.StateForActualGivenExecution object at 0x7efaf0ddc3a0>
data = ConjectureData(VALID, 0 bytes, frozen)

    def _execute_once_for_engine(self, data):
        """Wrapper around ``execute_once`` that intercepts test failure
        exceptions and single-test control exceptions, and turns them into
        appropriate method calls to `data` instead.

        This allows the engine to assume that any exception other than
        ``StopTest`` must be a fatal error, and should stop the entire engine.
        """
        try:
            trace = frozenset()
            if (
                self.failed_normally
                and not self.failed_due_to_deadline
                and Phase.shrink in self.settings.phases
                and Phase.explain in self.settings.phases
                and sys.gettrace() is None
                and not PYPY
            ):  # pragma: no cover
                # This is in fact covered by our *non-coverage* tests, but due to the
                # settrace() contention *not* by our coverage tests.  Ah well.
                tracer = Tracer()
                try:
                    sys.settrace(tracer.trace)
                    result = self.execute_once(data)
                    if data.status == Status.VALID:
                        self.explain_traces[None].add(frozenset(tracer.branches))
                finally:
                    sys.settrace(None)
                    trace = frozenset(tracer.branches)
            else:
                result = self.execute_once(data)
            if result is not None:
>               fail_health_check(
                    self.settings,
                    "Tests run under @given should return None, but "
                    f"{self.test.__name__} returned {result!r} instead.",
                    HealthCheck.return_value,
                )
E               hypothesis.errors.FailedHealthCheck: Tests run under @given should return None, but test_hypothesis_wrapper returned <coroutine object test_hypothesis_wrapper at 0x7efaf0bacf40> instead.
E               See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.return_value to the suppress_health_check settings for this test.

/usr/lib/python3.8/site-packages/hypothesis/core.py:692: FailedHealthCheck
----------------------------- Captured stdout call -----------------------------
You can add @seed(134889924673272868111293548963683899098) to this test to reproduce this failure.
=========================== short test summary info ============================
FAILED test_hypothesis_module_mark.py::test_hypothesis_wrapper - hypothesis.e...
==================== 1 failed, 1 passed, 1 xfailed in 0.06s ====================
______________________________________________________________________ test_hypothesis_function_mark _______________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_pytest_plugin.py:272: in test_hypothesis_function_mark
    result.assert_outcomes(passed=2 * len(get_all_backends()), xfailed=2 * len(get_all_backends()))
E   AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E     Omitting 3 identical items, use -vv to show
E     Differing items:
E     {'failed': 2} != {'failed': 0}
E     {'xfailed': 2} != {'xfailed': 4}
E     {'passed': 0} != {'passed': 4}
E     Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-152/test_hypothesis_function_mark0
collecting ... collected 4 items

test_hypothesis_function_mark.py::test_anyio_mark_first FAILED           [ 25%]
test_hypothesis_function_mark.py::test_anyio_mark_last FAILED            [ 50%]
test_hypothesis_function_mark.py::test_anyio_mark_first_fail XFAIL       [ 75%]
test_hypothesis_function_mark.py::test_anyio_mark_last_fail XFAIL        [100%]

=================================== FAILURES ===================================
____________________________ test_anyio_mark_first _____________________________

    @pytest.mark.anyio
>   @given(x=just(1))

test_hypothesis_function_mark.py:7:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <hypothesis.core.StateForActualGivenExecution object at 0x7efaf09a6190>
data = ConjectureData(VALID, 0 bytes, frozen)

    def _execute_once_for_engine(self, data):
        """Wrapper around ``execute_once`` that intercepts test failure
        exceptions and single-test control exceptions, and turns them into
        appropriate method calls to `data` instead.

        This allows the engine to assume that any exception other than
        ``StopTest`` must be a fatal error, and should stop the entire engine.
        """
        try:
            trace = frozenset()
            if (
                self.failed_normally
                and not self.failed_due_to_deadline
                and Phase.shrink in self.settings.phases
                and Phase.explain in self.settings.phases
                and sys.gettrace() is None
                and not PYPY
            ):  # pragma: no cover
                # This is in fact covered by our *non-coverage* tests, but due to the
                # settrace() contention *not* by our coverage tests.  Ah well.
                tracer = Tracer()
                try:
                    sys.settrace(tracer.trace)
                    result = self.execute_once(data)
                    if data.status == Status.VALID:
                        self.explain_traces[None].add(frozenset(tracer.branches))
                finally:
                    sys.settrace(None)
                    trace = frozenset(tracer.branches)
            else:
                result = self.execute_once(data)
            if result is not None:
>               fail_health_check(
                    self.settings,
                    "Tests run under @given should return None, but "
                    f"{self.test.__name__} returned {result!r} instead.",
                    HealthCheck.return_value,
                )
E               hypothesis.errors.FailedHealthCheck: Tests run under @given should return None, but test_anyio_mark_first returned <coroutine object test_anyio_mark_first at 0x7efaf09c0940> instead.
E               See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.return_value to the suppress_health_check settings for this test.

/usr/lib/python3.8/site-packages/hypothesis/core.py:692: FailedHealthCheck
----------------------------- Captured stdout call -----------------------------
You can add @seed(307602835713646369360981986938257476894) to this test to reproduce this failure.
_____________________________ test_anyio_mark_last _____________________________

>   ???

test_hypothesis_function_mark.py:13:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <hypothesis.core.StateForActualGivenExecution object at 0x7efaf0d68a30>
data = ConjectureData(VALID, 0 bytes, frozen)

    def _execute_once_for_engine(self, data):
        """Wrapper around ``execute_once`` that intercepts test failure
        exceptions and single-test control exceptions, and turns them into
        appropriate method calls to `data` instead.

        This allows the engine to assume that any exception other than
        ``StopTest`` must be a fatal error, and should stop the entire engine.
        """
        try:
            trace = frozenset()
            if (
                self.failed_normally
                and not self.failed_due_to_deadline
                and Phase.shrink in self.settings.phases
                and Phase.explain in self.settings.phases
                and sys.gettrace() is None
                and not PYPY
            ):  # pragma: no cover
                # This is in fact covered by our *non-coverage* tests, but due to the
                # settrace() contention *not* by our coverage tests.  Ah well.
                tracer = Tracer()
                try:
                    sys.settrace(tracer.trace)
                    result = self.execute_once(data)
                    if data.status == Status.VALID:
                        self.explain_traces[None].add(frozenset(tracer.branches))
                finally:
                    sys.settrace(None)
                    trace = frozenset(tracer.branches)
            else:
                result = self.execute_once(data)
            if result is not None:
>               fail_health_check(
                    self.settings,
                    "Tests run under @given should return None, but "
                    f"{self.test.__name__} returned {result!r} instead.",
                    HealthCheck.return_value,
                )
E               hypothesis.errors.FailedHealthCheck: Tests run under @given should return None, but test_anyio_mark_last returned <coroutine object test_anyio_mark_last at 0x7efaf09c0a40> instead.
E               See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.return_value to the suppress_health_check settings for this test.

/usr/lib/python3.8/site-packages/hypothesis/core.py:692: FailedHealthCheck
----------------------------- Captured stdout call -----------------------------
You can add @seed(88465997419669952577493369942677550850) to this test to reproduce this failure.
=========================== short test summary info ============================
FAILED test_hypothesis_function_mark.py::test_anyio_mark_first - hypothesis.e...
FAILED test_hypothesis_function_mark.py::test_anyio_mark_last - hypothesis.er...
========================= 2 failed, 2 xfailed in 0.09s =========================
========================================================================= short test summary info ==========================================================================
SKIPPED [1] tests/test_fileio.py:119: Drive only makes sense on Windows
SKIPPED [1] tests/test_fileio.py:159: Only makes sense on Windows
SKIPPED [3] tests/test_fileio.py:318: os.lchmod() is not available
SKIPPED [1] tests/test_taskgroups.py:967: Cancel messages are only supported on py3.9+
ERROR tests/test_eventloop.py::test_sleep_until[asyncio]
ERROR tests/test_eventloop.py::test_sleep_until[asyncio+uvloop]
ERROR tests/test_eventloop.py::test_sleep_until[trio]
ERROR tests/test_eventloop.py::test_sleep_until_in_past[asyncio]
ERROR tests/test_eventloop.py::test_sleep_until_in_past[asyncio+uvloop]
ERROR tests/test_eventloop.py::test_sleep_until_in_past[trio]
ERROR tests/test_eventloop.py::test_sleep_forever[asyncio]
ERROR tests/test_eventloop.py::test_sleep_forever[asyncio+uvloop]
ERROR tests/test_eventloop.py::test_sleep_forever[trio]
FAILED tests/test_pytest_plugin.py::test_plugin - pytest.PytestUnraisableExceptionWarning: Exception ignored in: <coroutine object async_fixture at 0x7efaf0dd7240>
FAILED tests/test_pytest_plugin.py::test_asyncio - pytest.PytestUnraisableExceptionWarning: Exception ignored in: <coroutine object TestClassFixtures.async_class_fixture...
FAILED tests/test_pytest_plugin.py::test_autouse_async_fixture - AssertionError: assert {'errors': 1,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_pytest_plugin.py::test_cancel_scope_in_asyncgen_fixture - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_pytest_plugin.py::test_hypothesis_module_mark - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_pytest_plugin.py::test_hypothesis_function_mark - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
=========================================================== 6 failed, 1224 passed, 6 skipped, 9 errors in 39.57s ===========================================================
agronholm commented 3 years ago

Ok, I did some digging and it seems that the actual plugin names are different (hypothesispytest, pytest_mock). Enabling the latter (pytest_mock) gets rid of the event loop test failures, but the pytest plugin failures remain, probably because the pytest runs spawned by those tests inherit the environment variables but not the -p plugin options.
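
For what it's worth, if plugin autoloading is disabled in the build environment (e.g. via PYTEST_DISABLE_PLUGIN_AUTOLOAD, which would explain why only the explicitly named plugins show up in the second run), one possible workaround is to load the plugins through the PYTEST_PLUGINS environment variable instead of -p, since the environment does reach the spawned runs while the -p options do not. A minimal sketch only; the module path anyio.pytest_plugin is an assumption about how the plugin is packaged, and the other plugins would need their module paths listed the same way:

# Sketch: PYTEST_PLUGINS takes a comma-separated list of module names that pytest
# loads as plugins, and the inner pytest runs spawned by tests/test_pytest_plugin.py
# inherit it along with the rest of the environment.
PYTEST_PLUGINS=anyio.pytest_plugin /usr/bin/pytest -ra -p no:itsdangerous -p no:randomly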

kloczek commented 3 years ago

Here is the result:

+ /usr/bin/pytest -ra -p anyio -p hypothesispytest -p pytest_mock
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1, configfile: pyproject.toml, testpaths: tests
plugins: anyio-3.3.1, hypothesis-6.14.6, mock-3.6.1
collected 1245 items

tests/test_compat.py .......................................................................................                                                         [  6%]
tests/test_debugging.py .......................                                                                                                                      [  8%]
tests/test_eventloop.py .........                                                                                                                                    [  9%]
tests/test_fileio.py .........................s...........s...........................................................sss........................................... [ 21%]
....................                                                                                                                                                 [ 22%]
tests/test_from_thread.py .............................................................................                                                              [ 28%]
tests/test_lowlevel.py ...........................                                                                                                                   [ 31%]
tests/test_pytest_plugin.py FFFFFF                                                                                                                                   [ 31%]
tests/test_signals.py .........                                                                                                                                      [ 32%]
tests/test_sockets.py .............................................................................................................................................. [ 43%]
.................................................................................................................................................................... [ 56%]
.......................                                                                                                                                              [ 58%]
tests/test_subprocesses.py ..................                                                                                                                        [ 60%]
tests/test_synchronization.py ...................................................................................................                                    [ 68%]
tests/test_taskgroups.py ........................................................................................................................................... [ 79%]
.....................................s                                                                                                                               [ 82%]
tests/test_to_process.py .....................                                                                                                                       [ 83%]
tests/test_to_thread.py ........................                                                                                                                     [ 85%]
tests/streams/test_buffered.py ............                                                                                                                          [ 86%]
tests/streams/test_file.py ..............................                                                                                                            [ 89%]
tests/streams/test_memory.py .................................................................                                                                       [ 94%]
tests/streams/test_stapled.py ..................                                                                                                                     [ 95%]
tests/streams/test_text.py ...............                                                                                                                           [ 97%]
tests/streams/test_tls.py ....................................                                                                                                       [100%]

================================================================================= FAILURES =================================================================================
_______________________________________________________________________________ test_plugin ________________________________________________________________________________
/usr/lib/python3.8/site-packages/_pytest/runner.py:311: in from_call
    result: Optional[TResult] = func()
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/unraisableexception.py:88: in pytest_runtest_call
    yield from unraisable_exception_runtest_hook()
/usr/lib/python3.8/site-packages/_pytest/unraisableexception.py:78: in unraisable_exception_runtest_hook
    warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E   pytest.PytestUnraisableExceptionWarning: Exception ignored in: <coroutine object async_fixture at 0x7f75dcda79c0>
E
E   Traceback (most recent call last):
E     File "/usr/lib64/python3.8/warnings.py", line 506, in _warn_unawaited_coroutine
E       warn(msg, category=RuntimeWarning, stacklevel=2, source=coro)
E   RuntimeWarning: coroutine 'async_fixture' was never awaited
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-153/test_plugin0
collecting ... collected 4 items

test_plugin.py::test_marked_test FAILED                                  [ 25%]
test_plugin.py::test_async_fixture_from_marked_test FAILED               [ 50%]
test_plugin.py::test_async_fixture_from_sync_test ERROR                  [ 75%]
test_plugin.py::test_skip_inline FAILED                                  [100%]

==================================== ERRORS ====================================
_____________ ERROR at setup of test_async_fixture_from_sync_test ______________
file /tmp/pytest-of-tkloczko/pytest-153/test_plugin0/test_plugin.py, line 20
  def test_async_fixture_from_sync_test(anyio_backend_name, async_fixture):
E       fixture 'anyio_backend_name' not found
>       available fixtures: async_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, some_feature, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/tmp/pytest-of-tkloczko/pytest-153/test_plugin0/test_plugin.py:20
=================================== FAILURES ===================================
_______________________________ test_marked_test _______________________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f75dce904c0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_plugin.py::test_marked_test'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
_____________________ test_async_fixture_from_marked_test ______________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f75dce143a0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_plugin.py::test_async_fixture_from_marked_test'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
_______________________________ test_skip_inline _______________________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f75dce90af0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_plugin.py::test_skip_inline'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
=========================== short test summary info ============================
FAILED test_plugin.py::test_marked_test - pytest.PytestUnhandledCoroutineWarn...
FAILED test_plugin.py::test_async_fixture_from_marked_test - pytest.PytestUnh...
FAILED test_plugin.py::test_skip_inline - pytest.PytestUnhandledCoroutineWarn...
ERROR test_plugin.py::test_async_fixture_from_sync_test
========================== 3 failed, 1 error in 0.60s ==========================
_______________________________________________________________________________ test_asyncio _______________________________________________________________________________
/usr/lib/python3.8/site-packages/_pytest/runner.py:311: in from_call
    result: Optional[TResult] = func()
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/unraisableexception.py:88: in pytest_runtest_call
    yield from unraisable_exception_runtest_hook()
/usr/lib/python3.8/site-packages/_pytest/unraisableexception.py:78: in unraisable_exception_runtest_hook
    warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E   pytest.PytestUnraisableExceptionWarning: Exception ignored in: <coroutine object TestClassFixtures.async_class_fixture at 0x7f75dcdbf540>
E
E   Traceback (most recent call last):
E     File "/usr/lib64/python3.8/warnings.py", line 506, in _warn_unawaited_coroutine
E       warn(msg, category=RuntimeWarning, stacklevel=2, source=coro)
E   RuntimeWarning: coroutine 'TestClassFixtures.async_class_fixture' was never awaited
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-153/test_asyncio0
collecting ... collected 4 items

test_asyncio.py::TestClassFixtures::test_class_fixture_in_test_method ERROR [ 25%]
test_asyncio.py::test_callback_exception_during_test FAILED              [ 50%]
test_asyncio.py::test_callback_exception_during_setup FAILED             [ 75%]
test_asyncio.py::test_callback_exception_during_teardown FAILED          [100%]

==================================== ERRORS ====================================
____ ERROR at setup of TestClassFixtures.test_class_fixture_in_test_method _____
file /tmp/pytest-of-tkloczko/pytest-153/test_asyncio0/test_asyncio.py, line 14
      def test_class_fixture_in_test_method(self, async_class_fixture, anyio_backend_name):
E       fixture 'anyio_backend_name' not found
>       available fixtures: anyio_backend, async_class_fixture, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, setup_fail_fixture, teardown_fail_fixture, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/tmp/pytest-of-tkloczko/pytest-153/test_asyncio0/test_asyncio.py:14
=================================== FAILURES ===================================
_____________________ test_callback_exception_during_test ______________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f75dcd63dc0>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_asyncio.py::test_callback_exception_during_test'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
_____________________ test_callback_exception_during_setup _____________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f75dcb66310>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_asyncio.py::test_callback_exception_during_setup'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
___________________ test_callback_exception_during_teardown ____________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f75dcdc3280>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_asyncio.py::test_callback_exception_during_teardown'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
=========================== short test summary info ============================
FAILED test_asyncio.py::test_callback_exception_during_test - pytest.PytestUn...
FAILED test_asyncio.py::test_callback_exception_during_setup - pytest.PytestU...
FAILED test_asyncio.py::test_callback_exception_during_teardown - pytest.Pyte...
ERROR test_asyncio.py::TestClassFixtures::test_class_fixture_in_test_method
========================== 3 failed, 1 error in 0.50s ==========================
________________________________________________________________________ test_autouse_async_fixture ________________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_pytest_plugin.py:175: in test_autouse_async_fixture
    result.assert_outcomes(passed=len(get_all_backends()))
E   AssertionError: assert {'errors': 1,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E     Omitting 4 identical items, use -vv to show
E     Differing items:
E     {'errors': 1} != {'errors': 0}
E     {'passed': 0} != {'passed': 2}
E     Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-153/test_autouse_async_fixture0
collecting ... collected 1 item

test_autouse_async_fixture.py::test_autouse_backend ERROR                [100%]

==================================== ERRORS ====================================
____________________ ERROR at setup of test_autouse_backend ____________________
file /tmp/pytest-of-tkloczko/pytest-153/test_autouse_async_fixture0/test_autouse_async_fixture.py, line 7
  def test_autouse_backend(autouse_backend_name):
file /tmp/pytest-of-tkloczko/pytest-153/test_autouse_async_fixture0/conftest.py, line 6
  @pytest.fixture(autouse=True)
  async def autouse_async_fixture(anyio_backend_name):
      global autouse_backend
      autouse_backend = anyio_backend_name
E       fixture 'anyio_backend_name' not found
>       available fixtures: autouse_async_fixture, autouse_backend_name, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

/tmp/pytest-of-tkloczko/pytest-153/test_autouse_async_fixture0/conftest.py:6
=========================== short test summary info ============================
ERROR test_autouse_async_fixture.py::test_autouse_backend
=============================== 1 error in 0.01s ===============================
__________________________________________________________________ test_cancel_scope_in_asyncgen_fixture ___________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_pytest_plugin.py:202: in test_cancel_scope_in_asyncgen_fixture
    result.assert_outcomes(passed=len(get_all_backends()))
E   AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E     Omitting 4 identical items, use -vv to show
E     Differing items:
E     {'passed': 0} != {'passed': 2}
E     {'failed': 1} != {'failed': 0}
E     Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-153/test_cancel_scope_in_asyncgen_fixture0
collecting ... collected 1 item

test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture FAILED [100%]

=================================== FAILURES ===================================
_______________________ test_cancel_in_asyncgen_fixture ________________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7f75dcb78700>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.8/site-packages/_pytest/runner.py:311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/site-packages/_pytest/runner.py:255: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/runner.py:170: in pytest_runtest_call
    raise e
/usr/lib/python3.8/site-packages/_pytest/runner.py:162: in pytest_runtest_call
    item.runtest()
/usr/lib/python3.8/site-packages/_pytest/python.py:1641: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/usr/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
/usr/lib/python3.8/site-packages/pluggy/manager.py:337: in traced_hookexec
    return outcome.get_result()
/usr/lib/python3.8/site-packages/pluggy/manager.py:335: in <lambda>
    outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
/usr/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
/usr/lib/python3.8/site-packages/_pytest/python.py:180: in pytest_pyfunc_call
    async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

nodeid = 'test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture'

    def async_warn_and_skip(nodeid: str) -> None:
        msg = "async def functions are not natively supported and have been skipped.\n"
        msg += (
            "You need to install a suitable plugin for your async framework, for example:\n"
        )
        msg += "  - anyio\n"
        msg += "  - pytest-asyncio\n"
        msg += "  - pytest-tornasync\n"
        msg += "  - pytest-trio\n"
        msg += "  - pytest-twisted"
>       warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E       pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E       You need to install a suitable plugin for your async framework, for example:
E         - anyio
E         - pytest-asyncio
E         - pytest-tornasync
E         - pytest-trio
E         - pytest-twisted

/usr/lib/python3.8/site-packages/_pytest/python.py:172: PytestUnhandledCoroutineWarning
=========================== short test summary info ============================
FAILED test_cancel_scope_in_asyncgen_fixture.py::test_cancel_in_asyncgen_fixture
============================== 1 failed in 0.12s ===============================
_______________________________________________________________________ test_hypothesis_module_mark ________________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_pytest_plugin.py:233: in test_hypothesis_module_mark
    result.assert_outcomes(passed=len(get_all_backends()) + 1, xfailed=len(get_all_backends()))
E   AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E     Omitting 3 identical items, use -vv to show
E     Differing items:
E     {'xfailed': 1} != {'xfailed': 2}
E     {'passed': 1} != {'passed': 3}
E     {'failed': 1} != {'failed': 0}
E     Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-153/test_hypothesis_module_mark0
collecting ... collected 3 items

test_hypothesis_module_mark.py::test_hypothesis_wrapper FAILED           [ 33%]
test_hypothesis_module_mark.py::test_hypothesis_wrapper_regular PASSED   [ 66%]
test_hypothesis_module_mark.py::test_hypothesis_wrapper_failing XFAIL    [100%]

=================================== FAILURES ===================================
___________________________ test_hypothesis_wrapper ____________________________

    @given(x=just(1))
>   async def test_hypothesis_wrapper(x):

test_hypothesis_module_mark.py:9:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <hypothesis.core.StateForActualGivenExecution object at 0x7f75dcd2a3a0>
data = ConjectureData(VALID, 0 bytes, frozen)

    def _execute_once_for_engine(self, data):
        """Wrapper around ``execute_once`` that intercepts test failure
        exceptions and single-test control exceptions, and turns them into
        appropriate method calls to `data` instead.

        This allows the engine to assume that any exception other than
        ``StopTest`` must be a fatal error, and should stop the entire engine.
        """
        try:
            trace = frozenset()
            if (
                self.failed_normally
                and not self.failed_due_to_deadline
                and Phase.shrink in self.settings.phases
                and Phase.explain in self.settings.phases
                and sys.gettrace() is None
                and not PYPY
            ):  # pragma: no cover
                # This is in fact covered by our *non-coverage* tests, but due to the
                # settrace() contention *not* by our coverage tests.  Ah well.
                tracer = Tracer()
                try:
                    sys.settrace(tracer.trace)
                    result = self.execute_once(data)
                    if data.status == Status.VALID:
                        self.explain_traces[None].add(frozenset(tracer.branches))
                finally:
                    sys.settrace(None)
                    trace = frozenset(tracer.branches)
            else:
                result = self.execute_once(data)
            if result is not None:
>               fail_health_check(
                    self.settings,
                    "Tests run under @given should return None, but "
                    f"{self.test.__name__} returned {result!r} instead.",
                    HealthCheck.return_value,
                )
E               hypothesis.errors.FailedHealthCheck: Tests run under @given should return None, but test_hypothesis_wrapper returned <coroutine object test_hypothesis_wrapper at 0x7f75dcc83240> instead.
E               See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.return_value to the suppress_health_check settings for this test.

/usr/lib/python3.8/site-packages/hypothesis/core.py:692: FailedHealthCheck
----------------------------- Captured stdout call -----------------------------
You can add @seed(189988880684927910673697087622098632454) to this test or run pytest with --hypothesis-seed=189988880684927910673697087622098632454 to reproduce this failure.
=========================== short test summary info ============================
FAILED test_hypothesis_module_mark.py::test_hypothesis_wrapper - hypothesis.e...
==================== 1 failed, 1 passed, 1 xfailed in 0.05s ====================
______________________________________________________________________ test_hypothesis_function_mark _______________________________________________________________________
/home/tkloczko/rpmbuild/BUILD/anyio-3.3.1/tests/test_pytest_plugin.py:272: in test_hypothesis_function_mark
    result.assert_outcomes(passed=2 * len(get_all_backends()), xfailed=2 * len(get_all_backends()))
E   AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E     Omitting 3 identical items, use -vv to show
E     Differing items:
E     {'xfailed': 2} != {'xfailed': 4}
E     {'passed': 0} != {'passed': 4}
E     {'failed': 2} != {'failed': 0}
E     Use -v to get the full diff
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-153/test_hypothesis_function_mark0
collecting ... collected 4 items

test_hypothesis_function_mark.py::test_anyio_mark_first FAILED           [ 25%]
test_hypothesis_function_mark.py::test_anyio_mark_last FAILED            [ 50%]
test_hypothesis_function_mark.py::test_anyio_mark_first_fail XFAIL       [ 75%]
test_hypothesis_function_mark.py::test_anyio_mark_last_fail XFAIL        [100%]

=================================== FAILURES ===================================
____________________________ test_anyio_mark_first _____________________________

    @pytest.mark.anyio
>   @given(x=just(1))

test_hypothesis_function_mark.py:7:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <hypothesis.core.StateForActualGivenExecution object at 0x7f75dcbd1160>
data = ConjectureData(VALID, 0 bytes, frozen)

    def _execute_once_for_engine(self, data):
        """Wrapper around ``execute_once`` that intercepts test failure
        exceptions and single-test control exceptions, and turns them into
        appropriate method calls to `data` instead.

        This allows the engine to assume that any exception other than
        ``StopTest`` must be a fatal error, and should stop the entire engine.
        """
        try:
            trace = frozenset()
            if (
                self.failed_normally
                and not self.failed_due_to_deadline
                and Phase.shrink in self.settings.phases
                and Phase.explain in self.settings.phases
                and sys.gettrace() is None
                and not PYPY
            ):  # pragma: no cover
                # This is in fact covered by our *non-coverage* tests, but due to the
                # settrace() contention *not* by our coverage tests.  Ah well.
                tracer = Tracer()
                try:
                    sys.settrace(tracer.trace)
                    result = self.execute_once(data)
                    if data.status == Status.VALID:
                        self.explain_traces[None].add(frozenset(tracer.branches))
                finally:
                    sys.settrace(None)
                    trace = frozenset(tracer.branches)
            else:
                result = self.execute_once(data)
            if result is not None:
>               fail_health_check(
                    self.settings,
                    "Tests run under @given should return None, but "
                    f"{self.test.__name__} returned {result!r} instead.",
                    HealthCheck.return_value,
                )
E               hypothesis.errors.FailedHealthCheck: Tests run under @given should return None, but test_anyio_mark_first returned <coroutine object test_anyio_mark_first at 0x7f75dca89ec0> instead.
E               See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.return_value to the suppress_health_check settings for this test.

/usr/lib/python3.8/site-packages/hypothesis/core.py:692: FailedHealthCheck
----------------------------- Captured stdout call -----------------------------
You can add @seed(78429032076484962500020332099571805145) to this test or run pytest with --hypothesis-seed=78429032076484962500020332099571805145 to reproduce this failure.
_____________________________ test_anyio_mark_last _____________________________

>   ???

test_hypothesis_function_mark.py:13:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <hypothesis.core.StateForActualGivenExecution object at 0x7f75dcbb0970>
data = ConjectureData(VALID, 0 bytes, frozen)

    def _execute_once_for_engine(self, data):
        """Wrapper around ``execute_once`` that intercepts test failure
        exceptions and single-test control exceptions, and turns them into
        appropriate method calls to `data` instead.

        This allows the engine to assume that any exception other than
        ``StopTest`` must be a fatal error, and should stop the entire engine.
        """
        try:
            trace = frozenset()
            if (
                self.failed_normally
                and not self.failed_due_to_deadline
                and Phase.shrink in self.settings.phases
                and Phase.explain in self.settings.phases
                and sys.gettrace() is None
                and not PYPY
            ):  # pragma: no cover
                # This is in fact covered by our *non-coverage* tests, but due to the
                # settrace() contention *not* by our coverage tests.  Ah well.
                tracer = Tracer()
                try:
                    sys.settrace(tracer.trace)
                    result = self.execute_once(data)
                    if data.status == Status.VALID:
                        self.explain_traces[None].add(frozenset(tracer.branches))
                finally:
                    sys.settrace(None)
                    trace = frozenset(tracer.branches)
            else:
                result = self.execute_once(data)
            if result is not None:
>               fail_health_check(
                    self.settings,
                    "Tests run under @given should return None, but "
                    f"{self.test.__name__} returned {result!r} instead.",
                    HealthCheck.return_value,
                )
E               hypothesis.errors.FailedHealthCheck: Tests run under @given should return None, but test_anyio_mark_last returned <coroutine object test_anyio_mark_last at 0x7f75dcbc1040> instead.
E               See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.return_value to the suppress_health_check settings for this test.

/usr/lib/python3.8/site-packages/hypothesis/core.py:692: FailedHealthCheck
----------------------------- Captured stdout call -----------------------------
You can add @seed(332937503774009953785629465287997010924) to this test or run pytest with --hypothesis-seed=332937503774009953785629465287997010924 to reproduce this failure.
=========================== short test summary info ============================
FAILED test_hypothesis_function_mark.py::test_anyio_mark_first - hypothesis.e...
FAILED test_hypothesis_function_mark.py::test_anyio_mark_last - hypothesis.er...
========================= 2 failed, 2 xfailed in 0.13s =========================
========================================================================= short test summary info ==========================================================================
SKIPPED [1] tests/test_fileio.py:119: Drive only makes sense on Windows
SKIPPED [1] tests/test_fileio.py:159: Only makes sense on Windows
SKIPPED [3] tests/test_fileio.py:318: os.lchmod() is not available
SKIPPED [1] tests/test_taskgroups.py:967: Cancel messages are only supported on py3.9+
FAILED tests/test_pytest_plugin.py::test_plugin - pytest.PytestUnraisableExceptionWarning: Exception ignored in: <coroutine object async_fixture at 0x7f75dcda79c0>
FAILED tests/test_pytest_plugin.py::test_asyncio - pytest.PytestUnraisableExceptionWarning: Exception ignored in: <coroutine object TestClassFixtures.async_class_fixture...
FAILED tests/test_pytest_plugin.py::test_autouse_async_fixture - AssertionError: assert {'errors': 1,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_pytest_plugin.py::test_cancel_scope_in_asyncgen_fixture - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_pytest_plugin.py::test_hypothesis_module_mark - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_pytest_plugin.py::test_hypothesis_function_mark - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
================================================================ 6 failed, 1233 passed, 6 skipped in 39.63s ================================================================
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-2a8bceb6-beb0-40e9-86f2-9d70e8afcfa3/test_rmtree_errorhandler_reado0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_rmtree_errorhandler_reado0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-2a8bceb6-beb0-40e9-86f2-9d70e8afcfa3/test_rmtree_errorhandler_rerai0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_rmtree_errorhandler_rerai0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-2a8bceb6-beb0-40e9-86f2-9d70e8afcfa3/test_safe_get_no_perms0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_safe_get_no_perms0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-2a8bceb6-beb0-40e9-86f2-9d70e8afcfa3/test_safe_set_no_perms0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_safe_set_no_perms0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-2a8bceb6-beb0-40e9-86f2-9d70e8afcfa3/test_safe_delete_no_perms0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_safe_delete_no_perms0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-2a8bceb6-beb0-40e9-86f2-9d70e8afcfa3
<class 'OSError'>: [Errno 39] Directory not empty: '/tmp/pytest-of-tkloczko/garbage-2a8bceb6-beb0-40e9-86f2-9d70e8afcfa3'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-882efed6-7a9b-465b-b65a-7f3ef89907dd/test_rmtree_errorhandler_reado0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_rmtree_errorhandler_reado0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-882efed6-7a9b-465b-b65a-7f3ef89907dd/test_rmtree_errorhandler_rerai0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_rmtree_errorhandler_rerai0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-882efed6-7a9b-465b-b65a-7f3ef89907dd/test_safe_set_no_perms0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_safe_set_no_perms0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-882efed6-7a9b-465b-b65a-7f3ef89907dd/test_safe_delete_no_perms0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_safe_delete_no_perms0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-882efed6-7a9b-465b-b65a-7f3ef89907dd/test_safe_get_no_perms0
<class 'OSError'>: [Errno 39] Directory not empty: 'test_safe_get_no_perms0'
  warnings.warn(
/usr/lib/python3.8/site-packages/_pytest/pathlib.py:80: PytestWarning: (rm_rf) error removing /tmp/pytest-of-tkloczko/garbage-882efed6-7a9b-465b-b65a-7f3ef89907dd
<class 'OSError'>: [Errno 39] Directory not empty: '/tmp/pytest-of-tkloczko/garbage-882efed6-7a9b-465b-b65a-7f3ef89907dd'
  warnings.warn(
agronholm commented 3 years ago

Ok, what if you patch test_pytest_plugin.py to explicitly enable the anyio plugin (in several places), like so:

    result = testdir.runpytest('-v', '-p', 'no:asyncio', '-p', 'anyio')
agronholm commented 3 years ago

Do you still need help or is this resolved?

kloczek commented 2 years ago

Just got back to anyio, and after many upgrades of the Python modules, here is the current state:

+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.1-3.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.1-3.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /home/tkloczko/rpmbuild/BUILD/anyio-3.3.1, configfile: pyproject.toml, testpaths: tests
collected 6 items / 19 errors

================================================================================== ERRORS ==================================================================================
__________________________________________________________________ ERROR collecting tests/test_compat.py ___________________________________________________________________
'anyio' not found in `markers` configuration option
_________________________________________________________________ ERROR collecting tests/test_debugging.py _________________________________________________________________
'anyio' not found in `markers` configuration option
_________________________________________________________________ ERROR collecting tests/test_eventloop.py _________________________________________________________________
'anyio' not found in `markers` configuration option
__________________________________________________________________ ERROR collecting tests/test_fileio.py ___________________________________________________________________
'anyio' not found in `markers` configuration option
________________________________________________________________ ERROR collecting tests/test_from_thread.py ________________________________________________________________
'anyio' not found in `markers` configuration option
_________________________________________________________________ ERROR collecting tests/test_lowlevel.py __________________________________________________________________
'anyio' not found in `markers` configuration option
__________________________________________________________________ ERROR collecting tests/test_signals.py __________________________________________________________________
'anyio' not found in `markers` configuration option
__________________________________________________________________ ERROR collecting tests/test_sockets.py __________________________________________________________________
'anyio' not found in `markers` configuration option
_______________________________________________________________ ERROR collecting tests/test_subprocesses.py ________________________________________________________________
'anyio' not found in `markers` configuration option
______________________________________________________________ ERROR collecting tests/test_synchronization.py ______________________________________________________________
'anyio' not found in `markers` configuration option
________________________________________________________________ ERROR collecting tests/test_taskgroups.py _________________________________________________________________
'anyio' not found in `markers` configuration option
________________________________________________________________ ERROR collecting tests/test_to_process.py _________________________________________________________________
'anyio' not found in `markers` configuration option
_________________________________________________________________ ERROR collecting tests/test_to_thread.py _________________________________________________________________
'anyio' not found in `markers` configuration option
_____________________________________________________________ ERROR collecting tests/streams/test_buffered.py ______________________________________________________________
'anyio' not found in `markers` configuration option
_______________________________________________________________ ERROR collecting tests/streams/test_file.py ________________________________________________________________
'anyio' not found in `markers` configuration option
______________________________________________________________ ERROR collecting tests/streams/test_memory.py _______________________________________________________________
'anyio' not found in `markers` configuration option
______________________________________________________________ ERROR collecting tests/streams/test_stapled.py ______________________________________________________________
'anyio' not found in `markers` configuration option
_______________________________________________________________ ERROR collecting tests/streams/test_text.py ________________________________________________________________
'anyio' not found in `markers` configuration option
________________________________________________________________ ERROR collecting tests/streams/test_tls.py ________________________________________________________________
'anyio' not found in `markers` configuration option
========================================================================= short test summary info ==========================================================================
ERROR tests/test_compat.py
ERROR tests/test_debugging.py
ERROR tests/test_eventloop.py
ERROR tests/test_fileio.py
ERROR tests/test_from_thread.py
ERROR tests/test_lowlevel.py
ERROR tests/test_signals.py
ERROR tests/test_sockets.py
ERROR tests/test_subprocesses.py
ERROR tests/test_synchronization.py
ERROR tests/test_taskgroups.py
ERROR tests/test_to_process.py
ERROR tests/test_to_thread.py
ERROR tests/streams/test_buffered.py
ERROR tests/streams/test_file.py
ERROR tests/streams/test_memory.py
ERROR tests/streams/test_stapled.py
ERROR tests/streams/test_text.py
ERROR tests/streams/test_tls.py
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 19 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
============================================================================ 19 errors in 0.83s ============================================================================
agronholm commented 2 years ago

Could you provide me step-by-step instructions on how to reproduce this result?

agronholm commented 2 years ago

Also, it's best to package v3.3.4 instead, as it contains fixes for a few nasty stream-related bugs.

kloczek commented 2 years ago

Just retested 3.3.4 using the methodology below:

+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /home/tkloczko/rpmbuild/BUILD/anyio-3.3.4, configfile: pyproject.toml, testpaths: tests
collected 6 items / 19 errors

================================================================================== ERRORS ==================================================================================
__________________________________________________________________ ERROR collecting tests/test_compat.py ___________________________________________________________________
'anyio' not found in `markers` configuration option
_________________________________________________________________ ERROR collecting tests/test_debugging.py _________________________________________________________________
'anyio' not found in `markers` configuration option
_________________________________________________________________ ERROR collecting tests/test_eventloop.py _________________________________________________________________
'anyio' not found in `markers` configuration option
__________________________________________________________________ ERROR collecting tests/test_fileio.py ___________________________________________________________________
'anyio' not found in `markers` configuration option
________________________________________________________________ ERROR collecting tests/test_from_thread.py ________________________________________________________________
'anyio' not found in `markers` configuration option
_________________________________________________________________ ERROR collecting tests/test_lowlevel.py __________________________________________________________________
'anyio' not found in `markers` configuration option
__________________________________________________________________ ERROR collecting tests/test_signals.py __________________________________________________________________
'anyio' not found in `markers` configuration option
__________________________________________________________________ ERROR collecting tests/test_sockets.py __________________________________________________________________
'anyio' not found in `markers` configuration option
_______________________________________________________________ ERROR collecting tests/test_subprocesses.py ________________________________________________________________
'anyio' not found in `markers` configuration option
______________________________________________________________ ERROR collecting tests/test_synchronization.py ______________________________________________________________
'anyio' not found in `markers` configuration option
________________________________________________________________ ERROR collecting tests/test_taskgroups.py _________________________________________________________________
'anyio' not found in `markers` configuration option
________________________________________________________________ ERROR collecting tests/test_to_process.py _________________________________________________________________
'anyio' not found in `markers` configuration option
_________________________________________________________________ ERROR collecting tests/test_to_thread.py _________________________________________________________________
'anyio' not found in `markers` configuration option
_____________________________________________________________ ERROR collecting tests/streams/test_buffered.py ______________________________________________________________
'anyio' not found in `markers` configuration option
_______________________________________________________________ ERROR collecting tests/streams/test_file.py ________________________________________________________________
'anyio' not found in `markers` configuration option
______________________________________________________________ ERROR collecting tests/streams/test_memory.py _______________________________________________________________
'anyio' not found in `markers` configuration option
______________________________________________________________ ERROR collecting tests/streams/test_stapled.py ______________________________________________________________
'anyio' not found in `markers` configuration option
_______________________________________________________________ ERROR collecting tests/streams/test_text.py ________________________________________________________________
'anyio' not found in `markers` configuration option
________________________________________________________________ ERROR collecting tests/streams/test_tls.py ________________________________________________________________
'anyio' not found in `markers` configuration option
========================================================================= short test summary info ==========================================================================
ERROR tests/test_compat.py
ERROR tests/test_debugging.py
ERROR tests/test_eventloop.py
ERROR tests/test_fileio.py
ERROR tests/test_from_thread.py
ERROR tests/test_lowlevel.py
ERROR tests/test_signals.py
ERROR tests/test_sockets.py
ERROR tests/test_subprocesses.py
ERROR tests/test_synchronization.py
ERROR tests/test_taskgroups.py
ERROR tests/test_to_process.py
ERROR tests/test_to_thread.py
ERROR tests/streams/test_buffered.py
ERROR tests/streams/test_file.py
ERROR tests/streams/test_memory.py
ERROR tests/streams/test_stapled.py
ERROR tests/streams/test_text.py
ERROR tests/streams/test_tls.py
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 19 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
============================================================================ 19 errors in 0.86s ============================================================================
agronholm commented 2 years ago

I didn't think 3.3.4 would help with any of these test suite issues you're experiencing, but upgrading would be good anyway.

AnyIO requires its own plugin in order to run the tests. For that, I believe you need to have AnyIO's .egg-info / .dist-info directory on your PYTHONPATH. Can you confirm whether this is the case or not?
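
For reference, a minimal sketch (assuming Python 3.8, run with the same PYTHONPATH as above) to check whether the plugin registration is actually visible — pytest discovers plugins through the pytest11 entry point group:

    # Sketch: list the "pytest11" entry points visible on the import path.
    # If AnyIO's metadata is picked up, an entry named "anyio" (pointing at
    # its pytest_plugin module) should show up here.
    from importlib.metadata import entry_points

    plugins = entry_points().get("pytest11", [])
    print([(ep.name, ep.value) for ep in plugins])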

kloczek commented 2 years ago

The setuptools install command automatically installs the .egg-info in the correct place:

+ /usr/bin/python3 setup.py install -O1 --skip-build --root /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64
running install
running install_lib
creating /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr
creating /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib
creating /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8
creating /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages
creating /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio
copying build/lib/anyio/__init__.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio
copying build/lib/anyio/from_thread.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio
copying build/lib/anyio/lowlevel.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio
copying build/lib/anyio/pytest_plugin.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio
copying build/lib/anyio/to_process.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio
copying build/lib/anyio/to_thread.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio
creating /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_backends
copying build/lib/anyio/_backends/__init__.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_backends
copying build/lib/anyio/_backends/_asyncio.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_backends
copying build/lib/anyio/_backends/_trio.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_backends
creating /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/__init__.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_compat.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_eventloop.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_exceptions.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_fileio.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_resources.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_signals.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_sockets.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_streams.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_subprocesses.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_synchronization.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_tasks.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_testing.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
copying build/lib/anyio/_core/_typedattr.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core
creating /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc
copying build/lib/anyio/abc/__init__.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc
copying build/lib/anyio/abc/_resources.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc
copying build/lib/anyio/abc/_sockets.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc
copying build/lib/anyio/abc/_streams.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc
copying build/lib/anyio/abc/_subprocesses.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc
copying build/lib/anyio/abc/_tasks.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc
copying build/lib/anyio/abc/_testing.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc
creating /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams
copying build/lib/anyio/streams/__init__.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams
copying build/lib/anyio/streams/buffered.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams
copying build/lib/anyio/streams/file.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams
copying build/lib/anyio/streams/memory.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams
copying build/lib/anyio/streams/stapled.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams
copying build/lib/anyio/streams/text.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams
copying build/lib/anyio/streams/tls.py -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams
copying build/lib/anyio/py.typed -> /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/__init__.py to __init__.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/from_thread.py to from_thread.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/lowlevel.py to lowlevel.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/pytest_plugin.py to pytest_plugin.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/to_process.py to to_process.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/to_thread.py to to_thread.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_backends/__init__.py to __init__.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_backends/_asyncio.py to _asyncio.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_backends/_trio.py to _trio.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/__init__.py to __init__.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_compat.py to _compat.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_eventloop.py to _eventloop.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_exceptions.py to _exceptions.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_fileio.py to _fileio.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_resources.py to _resources.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_signals.py to _signals.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_sockets.py to _sockets.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_streams.py to _streams.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_subprocesses.py to _subprocesses.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_synchronization.py to _synchronization.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_tasks.py to _tasks.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_testing.py to _testing.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/_core/_typedattr.py to _typedattr.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc/__init__.py to __init__.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc/_resources.py to _resources.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc/_sockets.py to _sockets.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc/_streams.py to _streams.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc/_subprocesses.py to _subprocesses.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc/_tasks.py to _tasks.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/abc/_testing.py to _testing.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams/__init__.py to __init__.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams/buffered.py to buffered.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams/file.py to file.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams/memory.py to memory.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams/stapled.py to stapled.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams/text.py to text.cpython-38.pyc
byte-compiling /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio/streams/tls.py to tls.cpython-38.pyc
writing byte-compilation script '/tmp/tmpykhegcil.py'
/usr/bin/python3 /tmp/tmpykhegcil.py
removing /tmp/tmpykhegcil.py
running install_egg_info
running egg_info
writing src/anyio.egg-info/PKG-INFO
writing dependency_links to src/anyio.egg-info/dependency_links.txt
writing entry points to src/anyio.egg-info/entry_points.txt
writing requirements to src/anyio.egg-info/requires.txt
writing top-level names to src/anyio.egg-info/top_level.txt
listing git files failed - pretending there aren't any
reading manifest file 'src/anyio.egg-info/SOURCES.txt'
adding license file 'LICENSE'
writing manifest file 'src/anyio.egg-info/SOURCES.txt'
Copying src/anyio.egg-info to /home/tkloczko/rpmbuild/BUILDROOT/python-anyio-3.3.4-2.fc35.x86_64/usr/lib/python3.8/site-packages/anyio-3.3.4-py3.8.egg-info
running install_scripts
agronholm commented 2 years ago

That setup.py install --root=... does not seem to install dependencies. Are you seeing the same? But even after installing the deps manually to the prefix I get a warning-turned-error from trio:

tests/test_taskgroups.py:8: in <module>
    import trio
/tmp/install/usr/local/lib/python3.9/site-packages/trio/__init__.py:18: in <module>
    from ._core import (
/tmp/install/usr/local/lib/python3.9/site-packages/trio/_core/__init__.py:20: in <module>
    from ._multierror import MultiError
/tmp/install/usr/local/lib/python3.9/site-packages/trio/_core/_multierror.py:511: in <module>
    warnings.warn(
E   RuntimeWarning: You seem to already have a custom sys.excepthook handler installed. I'll skip installing Trio's custom handler, but this means MultiErrors will not show full tracebacks.

Why this happens in this case and not with a regular virtualenv install is a mystery that I haven't figured out yet. In any case I am struggling to reproduce the case you're seeing, where pytest tries to run the test suite but isn't picking up any plugins.
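
As a quick check of the condition trio complains about, this sketch (run before importing trio) shows whether the build/test harness has already installed its own excepthook:

    # Sketch: trio's warning fires when sys.excepthook is no longer the
    # interpreter default; False here means a custom hook is installed.
    import sys

    print(sys.excepthook is sys.__excepthook__)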

agronholm commented 2 years ago

So, to revisit this, how did you install the dependencies (since none of the commands you listed do that)? Can you provide step by step instructions on how to reproduce the same results as you did?

kloczek commented 2 years ago

I'm trying to package your module as an RPM package, so I'm using the typical build, install, and test cycle used when building packages from a non-root account.

agronholm commented 2 years ago

So do you install all the dependencies manually via RPM or DNF?

agronholm commented 2 years ago

I tried replicating your test setup as closely as I could using Docker, but the test suite passes just fine. I've attached the resulting Dockerfile. What are you doing differently that still causes errors to appear?

FROM fedora:35

RUN dnf install -y \
    python-pytest \
    python-pytest-mock \
    python-trio \
    python-hypothesis \
    python-uvloop \
    python-trustme \
    git \
    python-pip
RUN adduser testuser
USER testuser:testuser
WORKDIR /home/testuser
RUN curl https://files.pythonhosted.org/packages/66/02/ca9061e93c487a897859e3a41f6c1a4f494038d2791382169b9a0c528175/anyio-3.3.4.tar.gz | tar xz
WORKDIR ./anyio-3.3.4
RUN mkdir $HOME/buildroot
RUN python3 setup.py build
RUN python3 setup.py install --root=$HOME/buildroot
ENV PYTHONPATH=/home/testuser/buildroot/usr/local/lib/python3.10/site-packages
CMD ["pytest"]

Build and run with:

docker build -t anyiotest .
docker run --rm --network host anyiotest
kloczek commented 2 years ago

So do you install all the dependencies manually via RPM or DNF?

Yes, and here is the list of BuildRequires from my spec file:

BuildRequires:  python3dist(setuptools-scm)
BuildRequires:  python3dist(sphinx)
BuildRequires:  python3dist(sphinx-autodoc-typehints)   >= 1.2
BuildRequires:  python3dist(sphinx-rtd-theme)
# CheckRequires:
BuildRequires:  python3dist(coverage)                   >= 4.5
BuildRequires:  python3dist(curio)
BuildRequires:  python3dist(hypothesis)                 >= 4
BuildRequires:  python3dist(idna)                       >= 2.8
BuildRequires:  python3dist(pytest)             >= 4.3
BuildRequires:  python3dist(sniffio)            >= 1.1
BuildRequires:  python3dist(trio)
BuildRequires:  python3dist(trustme)
BuildRequires:  python3dist(uvloop)
agronholm commented 2 years ago

Coverage and curio are not required by the tests, but pytest-mock is, and it is missing from the list.

I know I've asked you a few times already, but if after these changes the test run is still failing, can you please provide me a way to reproduce the problem locally?

kloczek commented 2 years ago

pytest-mock is in the dependencies (indirectly).

python-pytest-mock-3.6.1-4.g2v.noarch
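
A quick sanity check (just a sketch) that the module actually resolves on the interpreter's import path, however it got pulled in:

    # Sketch: confirm pytest_mock is importable in the test environment.
    import importlib.util

    print(importlib.util.find_spec("pytest_mock") is not None)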
agronholm commented 2 years ago

Would you mind coming over to Gitter at https://gitter.im/python-trio/AnyIO so we can talk this through in real time?

kloczek commented 2 years ago

Just joined

agronholm commented 2 years ago

To summarize the problems to anybody else reading this, the OP had altered the test environment in ways that caused the test run to fail:

agronholm commented 2 years ago

Do you need further assistance with this, or did the changes implemented in v3.4.0 solve the problems you had?

kloczek commented 2 years ago

Just tested 3.4.0

+ /usr/bin/pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
rootdir: /home/tkloczko/rpmbuild/BUILD/anyio-3.4.0, configfile: pyproject.toml, testpaths: tests
plugins: anyio-3.4.0, shutil-1.7.0, virtualenv-1.7.0, mock-3.6.1, cov-2.12.1, forked-1.3.0, xdist-2.3.0, flaky-3.7.0, tornasync-0.6.0.post2, console-scripts-1.2.0, trio-0.7.0, timeout-2.0.1, hypothesis-6.27.0, freezegun-0.4.2, flake8-1.0.7, pyfakefs-4.5.3
collected 285 items / 4 errors / 281 selected

================================================================================== ERRORS ==================================================================================
________________________________________________________________ ERROR collecting tests/test_from_thread.py ________________________________________________________________
In test_asyncio_run_sync_called: function uses no argument 'anyio_backend'
__________________________________________________________________ ERROR collecting tests/test_sockets.py __________________________________________________________________
tests/test_sockets.py:44: in <module>
    s = socket.socket(AddressFamily.AF_INET6)
/usr/lib64/python3.8/socket.py:231: in __init__
    _socket.socket.__init__(self, family, type, proto, fileno)
E   OSError: [Errno 97] Address family not supported by protocol
________________________________________________________________ ERROR collecting tests/test_taskgroups.py _________________________________________________________________
In test_start_native_host_cancelled: function uses no argument 'anyio_backend'
_________________________________________________________________ ERROR collecting tests/test_to_thread.py _________________________________________________________________
In test_asyncio_cancel_native_task: function uses no argument 'anyio_backend'
========================================================================= short test summary info ==========================================================================
ERROR tests/test_from_thread.py::TestBlockingPortal
ERROR tests/test_sockets.py - OSError: [Errno 97] Address family not supported by protocol
ERROR tests/test_taskgroups.py
ERROR tests/test_to_thread.py
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 4 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
============================================================================ 4 errors in 1.01s =============================================================================
agronholm commented 2 years ago

Curious that creating an IPv6 socket fails even though socket.has_ipv6 is True. I could not reproduce this behavior in a container where IPv6 was disabled.
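
A minimal probe of that situation (a sketch, separate from the test suite): socket.has_ipv6 can report True even when actually creating an AF_INET6 socket fails at runtime, which is exactly what the collection error above shows:

    # Sketch: has_ipv6 does not guarantee that an IPv6 socket can be created;
    # with the IPv6 stack disabled, creation fails with EAFNOSUPPORT.
    import socket

    print("socket.has_ipv6:", socket.has_ipv6)
    try:
        with socket.socket(socket.AF_INET6) as sock:
            print("AF_INET6 socket created:", sock.fileno())
    except OSError as exc:
        print("AF_INET6 socket failed:", exc)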

agronholm commented 2 years ago

Do you currently have pytest plugin autoloading disabled or not? v3.4.0 should work with autoloading disabled so if you don't have that, you should add that environment variable.
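
To illustrate (a sketch, not the packaging recipe): with autoloading disabled through the standard PYTEST_DISABLE_PLUGIN_AUTOLOAD environment variable, the anyio plugin then has to be enabled explicitly with -p, for example:

    # Sketch: disable third-party plugin autoloading, then load only the
    # anyio plugin explicitly and run the test suite.
    import os
    import pytest

    os.environ["PYTEST_DISABLE_PLUGIN_AUTOLOAD"] = "1"
    raise SystemExit(pytest.main(["-ra", "-p", "anyio", "tests"]))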

agronholm commented 2 years ago

I also just pushed a commit to master that should resolve that ipv6 collection error.

agronholm commented 2 years ago

I could run the tests on your behalf but I don't have access to your unique environment any more.

kloczek commented 2 years ago

You can just disable the IPv6 stack and run the test suite.

agronholm commented 2 years ago

The IPv6 issue should be fixed now; did you try against master yet? And what about pytest plugin autoloading?

kloczek commented 2 years ago

Nope, and as I'm busy now I'll test later, or my automation will try to do that when the next release is tagged.