pytest-dev / pytest-asyncio

Asyncio support for pytest
https://pytest-asyncio.readthedocs.io
Apache License 2.0

Async fixtures may break current event loop #868

Open MarkusSintonen opened 4 months ago

MarkusSintonen commented 4 months ago

It seems pytest-asyncio (0.23.7) is currently badly broken with code relying on aiohttp.

This simple example currently breaks. The example is heavily simplified from the actual fixture setup. This used to work fine before 0.22.

import aiohttp
import pytest

# No longer working for getting a single event loop for aiohttp
# @pytest.fixture(scope="session")
# def event_loop():
#    loop = asyncio.get_event_loop_policy().new_event_loop()
#    try:
#        yield loop
#    finally:
#        loop.close()

@pytest.fixture(scope="session")
async def client(
    # event_loop <- accessing this no longer working. But neither this works without it
):
    async with aiohttp.ClientSession(
        # loop=event_loop <- aiohttp requires using a single event_loop that can not be closed between the tests...
    ) as session:
        yield session

async def test_the_client(client: aiohttp.ClientSession):
    await client.get("http://localhost:8080/foobar")  # RuntimeError: Timeout context manager should be used inside a task

Using asyncio_mode = "auto":

[tool.poetry]
name = "foobar"
package-mode = false

[tool.poetry.dependencies]
python = "~3.12"
aiohttp = "==3.9.5"

[tool.poetry.dev-dependencies]
pytest = "==8.2.2"
pytest-asyncio = "==0.23.7"

[tool.pytest.ini_options]
asyncio_mode = "auto"

MarkusSintonen commented 4 months ago

Using this https://pytest-asyncio.readthedocs.io/en/latest/how-to-guides/run_session_tests_in_same_loop.html doesn't have any effect. It seems there are multiple issues in pytest-asyncio now. There is no longer access to a single event loop used for tests. Also, pytest-asyncio now closes the event loop between tests. How do I get back the previous working behaviour that is compatible with, e.g., aiohttp? It is also not possible to change the session-scoped client fixture to a function fixture. (That wouldn't make sense either; we want a single client per test session.)

MarkusSintonen commented 4 months ago

There seem to be multiple open issues related to the latest version of pytest-asyncio. I couldn't find anything directly related to aiohttp usage.

seifertm commented 4 months ago

Thanks for the code example.

In your specific code, the issue is that the test and the fixture run in different asyncio event loops. Each pytest scope has its own event loop. When specifying a @pytest.fixture(scope="session"), the fixture code runs in a session-scoped loop, whereas test_the_client uses the default function-scoped loop.

Adding @pytest.mark.asyncio(scope="session") to test_the_client makes the test code run in the same session-scoped loop, thus eliminating the issue.
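
For illustration, a minimal sketch based on the reproducer above, with the marker applied to the test (same fixture, test name, and URL as in the original example):

import aiohttp
import pytest

@pytest.fixture(scope="session")
async def client():
    async with aiohttp.ClientSession() as session:
        yield session

# Run the test in the same session-scoped loop as the session-scoped client fixture.
@pytest.mark.asyncio(scope="session")
async def test_the_client(client: aiohttp.ClientSession):
    await client.get("http://localhost:8080/foobar")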

However, there's a tightly linked known issue in pytest-asyncio, which makes it impossible to separate the scope of the event loop from the scope of the pytest fixture. Sometimes it's necessary to have code that runs in a session-wide event loop, but that code should be re-executed for every function. This is not possible with v0.23 at the moment. If you require this functionality, you should pin pytest-asyncio to v0.21 until the next major release. This issue is tracked in #706.

As for why the code mentioned in the docs doesn't work, that's an entirely different topic that needs investigating.

seifertm commented 4 months ago

When I add the following code snippet to conftest.py (instead of @pytest.mark.asyncio(scope="session") as mentioned in my previous comment), the example you provided runs as expected.

import pytest

from pytest_asyncio import is_async_test

def pytest_collection_modifyitems(items):
    pytest_asyncio_tests = (item for item in items if is_async_test(item))
    session_scope_marker = pytest.mark.asyncio(scope="session")
    for async_test in pytest_asyncio_tests:
        async_test.add_marker(session_scope_marker, append=False)  # append=False prepends the marker so it takes precedence over other asyncio markers

I referred to the exact same documentation page as you did (see How to run all tests in the session in the same event loop).

Do you think you could provide another code example where the code snippet in the docs doesn't solve the issues you're experiencing?

MarkusSintonen commented 4 months ago

@seifertm Thanks

It seems I was missing the append=False parameter 🤔 Strange... It partially works with that. Btw, could we instead have some config parameter so we don't have to add this boilerplate to every test suite (like the asyncio_mode TOML param)?

But even with this, I still have problems getting it working with more complex sets of fixtures. I was able to narrow the problems down to the following weird case. It again breaks with RuntimeError: Timeout context manager should be used inside a task when adding an async autouse fixture (either to the test file or to the conftest):

@pytest.fixture(autouse=True)
async def some_function_scope_autouse_async_fixture():
    pass

Full failing example:

import aiohttp
import pytest

@pytest.fixture(autouse=True)
async def some_function_scope_autouse_async_fixture():
    pass

@pytest.fixture(scope="session")
async def client():
    async with aiohttp.ClientSession() as session:
        yield session

async def test_the_client(client):
    async with client.get("http://localhost:8080/foobar") as resp:  # RuntimeError: Timeout context manager should be used inside a task
        resp.raise_for_status()

The pytest.mark.asyncio(scope="session") trick in conftest doesn't help here... In real usage I'm also seeing Event loop is closed errors, but I wasn't able to narrow down what exactly causes that other error yet. Let's first investigate the above issue with the autouse fixture...

seifertm commented 4 months ago

Thanks for the reproducer!

The autouse fixture seems to pull in the function-scoped event_loop fixture as a dependency of test_the_client in addition to the session-scoped loop. I can verify this using pytest's --setup-show output:

$ pytest --asyncio-mode=auto --setup-show
===== test session starts =====
platform linux -- Python 3.12.4, pytest-8.2.2, pluggy-1.5.0
rootdir: /tmp/tst
plugins: asyncio-0.23.7
asyncio: mode=Mode.AUTO
collected 1 item                                                                                                                                                                                                                                                                                                    

test_a.py 
SETUP    S event_loop_policy
SETUP    S _session_event_loop (fixtures used: event_loop_policy)
SETUP    S client (fixtures used: _session_event_loop)
        SETUP    F event_loop
        SETUP    F some_function_scope_autouse_async_fixture (fixtures used: event_loop)
        test_a.py::test_the_client (fixtures used: _session_event_loop, client, event_loop, event_loop_policy, request, some_function_scope_autouse_async_fixture)F
        TEARDOWN F some_function_scope_autouse_async_fixture
        TEARDOWN F event_loop
TEARDOWN S client
TEARDOWN S _session_event_loop
TEARDOWN S event_loop_policy

===== FAILURES =====
_____ test_the_client _____

client = <aiohttp.client.ClientSession object at 0x7fa285fbf3b0>

    @pytest.mark.asyncio(scope="session")
    async def test_the_client(client):
>       async with client.get("http://localhost:8080/foobar") as resp:  # RuntimeError: Timeout context manager should be used inside a task

test_a.py:18: 
_ _ _
venv/lib/python3.12/site-packages/aiohttp/client.py:1197: in __aenter__
    self._resp = await self._coro
venv/lib/python3.12/site-packages/aiohttp/client.py:507: in _request
    with timer:
_ _ _

self = <aiohttp.helpers.TimerContext object at 0x7fa285fbcc80>

    def __enter__(self) -> BaseTimerContext:
        task = current_task(loop=self._loop)

        if task is None:
>           raise RuntimeError(
                "Timeout context manager should be used " "inside a task"
            )
E           RuntimeError: Timeout context manager should be used inside a task

venv/lib/python3.12/site-packages/aiohttp/helpers.py:715: RuntimeError
===== short test summary info =====
FAILED test_a.py::test_the_client - RuntimeError: Timeout context manager should be used inside a task
===== 1 failed in 0.12s =====

Since the event_loop fixture is evaluated after the setup of the session-scoped client fixture, the test ends up running in the event_loop loop.

This should actually result in a MultipleEventLoopsRequestedError, but it seems to be circumvented by the autouse mechanism. This is clearly a bug in pytest-asyncio.

Until this is fixed, you can change the scope of some_function_scope_autouse_async_fixture to scope="session", if your test suite allows that.
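
Applied to the reproducer, that workaround is a one-line change (a sketch that assumes the fixture really can live for the whole session; the name is kept from the original example):

import pytest

# Workaround sketch: give the autouse fixture session scope so it runs in the
# same session-scoped event loop as the client fixture and the test.
@pytest.fixture(autouse=True, scope="session")
async def some_function_scope_autouse_async_fixture():
    pass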

seifertm commented 4 months ago

It seems I was missing the append=False parameter 🤔 Strange... It partially works with that. Btw, could we instead have some config parameter so we don't have to add this boilerplate to every test suite (like the asyncio_mode TOML param)?

This feature has been requested a number of times and it's very reasonable. There are plans to add this kind of "default event loop scope" in #871.
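
For context, a minimal sketch of what such a setting could look like, using the option name from the 0.24.0a0 pre-release discussed further down in this thread:

[tool.pytest.ini_options]
asyncio_mode = "auto"
# Run async fixtures in the session-scoped event loop by default
# (ini option from the pre-release mentioned later in this thread).
asyncio_default_fixture_loop_scope = "session"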

MarkusSintonen commented 4 months ago

Until this is fixed, you can change the scope of some_function_scope_autouse_async_fixture to scope="session", if your test suite allows that.

No, that's not possible, and it wouldn't make sense in real usage. 😕 The example was greatly simplified.

MarkusSintonen commented 4 months ago

@seifertm Actually, there are more issues here that are not necessarily related to using autouse=True with an async fixture. It fails in many other cases as well:

No autouse=True on the async fixture, but it still fails:

import aiohttp
import pytest

@pytest.fixture
async def function_scope_async_fixture():
    pass

@pytest.fixture(scope="session")
async def client():
    async with aiohttp.ClientSession() as session:
        yield session

async def test_the_client(client, function_scope_async_fixture):
    async with client.get("http://localhost:8080/foobar") as resp:  # RuntimeError: Timeout context manager should be used inside a task
        resp.raise_for_status()

No autouse=True, but with an async-to-sync fixture chain:

import aiohttp
import pytest

@pytest.fixture
async def function_scope_async_fixture():
    pass

@pytest.fixture
def function_scope_sync_fixture(function_scope_async_fixture):
    pass

@pytest.fixture(scope="session")
async def client():
    async with aiohttp.ClientSession() as session:
        yield session

async def test_the_client(client, function_scope_sync_fixture):
    async with client.get("http://localhost:8080/foobar") as resp:  # RuntimeError: Timeout context manager should be used inside a task
        resp.raise_for_status()

With autouse=True on a sync fixture:

import aiohttp
import pytest

@pytest.fixture
async def function_scope_async_fixture():
    pass

@pytest.fixture(autouse=True)
def function_scope_autouse_sync_fixture(function_scope_async_fixture):
    pass

@pytest.fixture(scope="session")
async def client():
    async with aiohttp.ClientSession() as session:
        yield session

async def test_the_client(client):
    async with client.get("http://localhost:8080/foobar") as resp:  # RuntimeError: Timeout context manager should be used inside a task
        resp.raise_for_status()

seifertm commented 4 months ago

Thanks for the investigation. Indeed, the issue is not specifically related to autouse fixtures, but to the usage of mixed event loop scopes in general. The linked PR aims to extend the detection of mixed loop scopes in a single test. However, a proper fix requires that the caching scope of a fixture and its loop scope be independent. In other words, this issue is blocked by #706.
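
Purely as an illustration of what that separation could look like (the loop_scope keyword shown here is hypothetical and not part of the v0.23 API), a per-test fixture that still executes inside the session-wide loop might be declared as:

import pytest_asyncio

# Hypothetical sketch: the fixture would be set up and torn down for every test
# function (caching scope), while its async code runs in the session-wide event
# loop (loop scope). This separation is what #706 asks for.
@pytest_asyncio.fixture(scope="function", loop_scope="session")
async def per_test_resource():
    yield object()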

MarkusSintonen commented 4 months ago

but to the usage of mixed event loop scopes in general

It seems the library is now broken/buggy in general, as none of those examples actually access or configure the event loop directly. They just use pretty normal async code with different fixture scopes; there is nothing special about them. It is then better to avoid the latest versions of pytest-asyncio until the regressions are fixed. 🙏

MarkusSintonen commented 4 months ago

@seifertm modified the milestones: v0.23, v1.0

It would have been great if the huge breaking changes had been introduced only in a 1.x version. Would it be possible to bring back the old working behavior in 0.x, moving the current regressed one to a 1.x beta? 🙂 Looking at the other issues, it has caused a severe number of problems for a large number of projects.

seifertm commented 4 months ago

It would have been great if the huge breaking changes had been introduced only in a 1.x version.

I agree that bumping to v1.0 should have been done much earlier. However, pytest-asyncio versioning follows Semantic Versioning, which states:

Major version zero (0.y.z) is for initial development. Anything MAY change at any time. The public API SHOULD NOT be considered stable.

In addition to that, breaking changes are explicitly marked as such in the changelog using the BREAKING prefix. Admittedly, this wasn't the case for the "separate caching and event loop scopes for fixtures" feature, because the use case flew under the radar. This resulted in bug report #706.

I'm open to suggestions for increasing awareness of breaking changes or preventing such a topic from slipping through.

Would it be possible to bring back the old working behavior in 0.x, moving the current regressed one to a 1.x beta? 🙂

To my knowledge, most users that are affected by #706 stayed on pytest-asyncio v0.21. Is that an option for you? Alternatively, you may want to look into anyio, which comes with a pytest plugin.

Considering that the v0.23 version has been around since December last year, what procedure do you propose?

Looking at the other issues, it has caused a severe number of problems for a large number of projects.

I agree that the v0.23 release caused issues for a number of projects and that's unfortunate. At the same time, there are many more projects that have rather simple requirements and can use pytest-asyncio despite the existing bugs. They're just not as visible, because there's no need for them to file an issue. In general, it's always helpful for the pytest-asyncio maintainers if you link to projects where the upgrade from v0.21 to v0.23 went wrong.

MarkusSintonen commented 4 months ago

In addition to that, breaking changes are explicitly marked as such in the changelog using the BREAKING prefix. Admittedly, this wasn't the case for the "separate caching and event loop scopes for fixtures" feature, because the use case flew under the radar. This resulted in bug report https://github.com/pytest-dev/pytest-asyncio/issues/706.

Yeah, I'm not seeing any mention of fixtures that use async code being incompatible or broken in some scenarios. They are even broken in some cases that are out of the user's control (examples here).

I'm open to suggestions for increasing awareness of breaking changes or preventing such a topic from slipping through.

Well, the only good way to make users aware is to not get stuck on 0.x versions and to use SemVer to communicate breaking changes. With that, all the dependency management tooling works as well.

To my knowledge, most users that are affected by https://github.com/pytest-dev/pytest-asyncio/issues/706 stayed on pytest-asyncio v0.21. Is that an option for you?

That is the only way; we are already pinning to that version and configuring automated dependency maintenance not to touch pytest-asyncio. But surely this is not a good solution.

Considering that the v0.23 version has been around since December last year, what procedure do you propose?

Maybe creating v0.24 from the pre-v0.22 code and backporting any relevant bug fixes that happened in between. It could also include all the missing unit tests that would guard against the issues users have been reporting (e.g. from this issue). Then spinning off a v1.0 beta from the current state, with the previous tests in xfail mode, and finally releasing v1.0 once the tests are fixed and the other refactorings are done. I'm happy to help if you think it makes sense.

In general, it's always helpful for the pytest-asyncio maintainers if you link to projects where the upgrade from v0.21 to v0.23 went wrong.

I was just looking at the other issues in this repo (after the v0.22) and some issues having large amounts of thumbs-up reactions. I dont have anything other tangible to give there unfortunately. :/

seifertm commented 4 months ago

Maybe creating v0.24 from the pre-v0.22 code and backporting any relevant bug fixes that happened in between. It could also include all the missing unit tests that would guard against the issues users have been reporting (e.g. from this issue). Then spinning off a v1.0 beta from the current state, with the previous tests in xfail mode, and finally releasing v1.0 once the tests are fixed and the other refactorings are done. I'm happy to help if you think it makes sense.

Just by looking at the diff, reverting the 0.23 changes seems to be quite an effort: https://github.com/pytest-dev/pytest-asyncio/compare/v0.21.x...main

It also means that we would need to provide a migration path for users who already upgraded to v0.23. They would first be moved back to the v0.21 behavior, only to face breaking changes again when going to 1.0(-beta).

I'm not saying we should entirely rule out moving back to the v0.21 behavior and reverting the event loop fixture deprecation. (You're not the only proponent of that approach.) I'm just saying, we should rather patch the current main branch to get the old behavior, instead of backporting changes. If we decide to revert the event loop fixture deprecation, that is.

For now, I'm betting on #871 to solve the major issue with pytest-asyncio v0.23. If it turns out this is a bad idea as well, then pytest-asyncio should probably go back to v0.21 behavior.

If I have a pre-release with the PR merged, do you mind if I ping you as a reviewer/tester? I believe your input would be valuable.

MarkusSintonen commented 4 months ago

If I have a pre-release with the PR merged, do you mind if I ping you as a reviewer/tester? I believe your input would be valuable.

Yes, you can put me there as a reviewer/tester!

For now, I'm betting on https://github.com/pytest-dev/pytest-asyncio/pull/871 to solve the major issue with pytest-asyncio v0.23. If it turns out this is a bad idea as well, then pytest-asyncio should probably go back to v0.21 behavior.

I tried your branch there, but I'm still getting the same errors 🤔 I also used the new param:

[tool.pytest.ini_options]
asyncio_mode = "auto"
asyncio_default_fixture_loop_scope = "session"

I don't really understand what the new param means. Why are fixtures somehow run in varying event loop scopes, and why are tests also somehow run in different event loops? 🤔 (I tried to look at the docs on the homepage, but they don't explain the new behaviour very clearly either.)

seifertm commented 3 months ago

@MarkusSintonen As part of the pytest-asyncio 0.24.0a0 pre-release I updated the docs with how-to guides and a section explaining the new configuration setting. Do you think this is enough to make sense of it?

MarkusSintonen commented 3 months ago

MarkusSintonen As part of the pytest-asyncio 0.24.0a0 pre-release I updated the docs with how-to guides and a section explaining the new configuration setting. Do you think this is enough to make sense of it?

I'm not sure I understand the purpose of running fixtures and tests in different event loops. The docs don't really explain it. I also can't grasp how it changed from the previous working state, and so why the new parameters are needed. A migration guide would probably also be needed, mentioning what is currently not working.