ipwnponies / pytest-antilru

Bust functools.lru_cache when running pytest to avoid test pollution
MIT License

Cache does not get emptied in wrapped LRU #28

Closed HansBambel closed 11 months ago

HansBambel commented 11 months ago

Thank you for this nice tool!

I noticed that the cache does not get emptied when lru_cache is wrapped in a custom decorator.

Here I am using a timed LRU cache.

utils.py:

from datetime import datetime, timedelta
from functools import lru_cache, wraps
import random

def timed_lru_cache(seconds: int, maxsize: int = 128):
    """Create a timed LRU cache"""

    def wrapper_cache(func):
        func = lru_cache(maxsize=maxsize)(func)
        func.lifetime = timedelta(seconds=seconds)
        func.expiration = datetime.utcnow() + func.lifetime

        @wraps(func)
        def wrapped_func(*args, **kwargs):
            # Once the lifetime has elapsed, clear the cache and reset the expiry
            if datetime.utcnow() >= func.expiration:
                func.cache_clear()
                func.expiration = datetime.utcnow() + func.lifetime

            return func(*args, **kwargs)

        return wrapped_func

    return wrapper_cache

@timed_lru_cache(seconds=10)
def rand_num():
    return random.randint(1, 100_000_000)

test_utils.py:

from utils import rand_num

def test_timed_cache_1():
    value = rand_num()
    assert value == 1

def test_timed_cache_2():
    value = rand_num()
    assert value == 1

I inserted a breakpoint on the assert line and noticed that the numbers are the same across the tests (repeatedly). I'm not sure how to demonstrate this differently.
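For illustration (a hypothetical check, not part of the original report), one way to observe the reuse without a breakpoint is to call the function twice inside a single test; across tests, the pollution then shows up as the same number being returned in test_timed_cache_1 and test_timed_cache_2:

from utils import rand_num

def test_cache_is_reused():
    # Both calls return the same number because the second call hits the cache.
    assert rand_num() == rand_num()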

Is there a way to have these caches emptied as well?

HansBambel commented 11 months ago

Never mind! I managed to make it work by improving my timed-cache implementation and calling rand_num.cache_clear() explicitly:

import time
from functools import lru_cache, wraps

def timed_lru_cache(seconds: int, maxsize: int = 128, typed: bool = False):
    """Create a timed LRU cache"""
    def wrapper_cache(func):
        func = lru_cache(maxsize=maxsize, typed=typed)(func)
        func.delta = seconds * 10 ** 9  # lifetime in nanoseconds
        func.expiration = time.monotonic_ns() + func.delta

        @wraps(func)
        def wrapped_func(*args, **kwargs):
            if time.monotonic_ns() >= func.expiration:
                func.cache_clear()
                func.expiration = time.monotonic_ns() + func.delta
            return func(*args, **kwargs)

        wrapped_func.cache_info = func.cache_info
        wrapped_func.cache_clear = func.cache_clear
        return wrapped_func

    return wrapper_cache

This was missing:

wrapped_func.cache_info = func.cache_info
wrapped_func.cache_clear = func.cache_clear

It still did not work with pytest-antilru, though :(

ipwnponies commented 11 months ago

I cannot reproduce this issue. The files, copied verbatim as listed in https://github.com/ipwnponies/pytest-antilru/issues/28#issue-1835174626, work as expected (Python 3.9.15).

$ pip list
Package        Version
-------------- -------
exceptiongroup 1.1.2
iniconfig      2.0.0
packaging      23.1
pip            23.1.2
pluggy         1.2.0
pytest         7.4.0
pytest-antilru 1.1.1
setuptools     67.8.0
tomli          2.0.1
wheel          0.40.0

$ ls
__pycache__/  bin/  lib/  pyvenv.cfg  test_utils.py  utils.py

$ pytest -q test_utils.py
FF                                                                       [100%]
=================================== FAILURES ===================================
______________________________ test_timed_cache_1 ______________________________

    def test_timed_cache_1():
        value = rand_num()
>       assert value == 1
E       assert 8895233 == 1

test_utils.py:6: AssertionError
______________________________ test_timed_cache_2 ______________________________

    def test_timed_cache_2():
        value = rand_num()
>       assert value == 1
E       assert 27925160 == 1

test_utils.py:11: AssertionError
=========================== short test summary info ============================
FAILED test_utils.py::test_timed_cache_1 - assert 8895233 == 1
FAILED test_utils.py::test_timed_cache_2 - assert 27925160 == 1
2 failed in 0.04s

And works as expected when pytest-antilru is not installed:

$ pip uninstall pytest-antilru
Found existing installation: pytest-antilru 1.1.1
Uninstalling pytest-antilru-1.1.1:
  Would remove:
    /taco/pytest-antilru/foo/lib/python3.9/site-packages/pytest_antilru-1.1.1.dist-info/*
    /taco/pytest-antilru/foo/lib/python3.9/site-packages/pytest_antilru/*
Proceed (Y/n)? y
  Successfully uninstalled pytest-antilru-1.1.1

$ pip list
Package        Version
-------------- -------
exceptiongroup 1.1.2
iniconfig      2.0.0
packaging      23.1
pip            23.1.2
pluggy         1.2.0
pytest         7.4.0
setuptools     67.8.0
tomli          2.0.1
wheel          0.40.0

$ pytest -q test_utils.py
FF                                                                       [100%]
=================================== FAILURES ===================================
______________________________ test_timed_cache_1 ______________________________

    def test_timed_cache_1():
        value = rand_num()
>       assert value == 1
E       assert 43921314 == 1

test_utils.py:6: AssertionError
______________________________ test_timed_cache_2 ______________________________

    def test_timed_cache_2():
        value = rand_num()
>       assert value == 1
E       assert 43921314 == 1

test_utils.py:11: AssertionError
=========================== short test summary info ============================
FAILED test_utils.py::test_timed_cache_1 - assert 43921314 == 1
FAILED test_utils.py::test_timed_cache_2 - assert 43921314 == 1
2 failed in 0.05s

This package monkey patches functools.lru_cache() at the start of pytest collection (before application imports). In your timed_lru_cache decorator, when you reference lru_cache it has already been patched, so you are literally referencing the wrapper. There's no way to get the canonical functools.lru_cache; it has been shadowed entirely. So either caching doesn't work at all (a bug in your code) or it has to go through this wrapper during the pytest run.
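For context, a simplified sketch of that approach (an assumed shape for illustration, not the plugin's actual source) looks roughly like this:

import functools

# Keep a handle on the real lru_cache and track every cache created while the
# patch is active, so they can all be cleared between tests.
_original_lru_cache = functools.lru_cache
_tracked = []

def _patched_lru_cache(maxsize=128, typed=False):
    def decorator(func):
        cached = _original_lru_cache(maxsize=maxsize, typed=typed)(func)
        _tracked.append(cached)
        return cached
    return decorator

# Applied before application code imports, so `from functools import lru_cache`
# in application modules picks up the patched version.
functools.lru_cache = _patched_lru_cache

def bust_all_caches():
    # Something along these lines runs between tests to avoid cache pollution.
    for cached in _tracked:
        cached.cache_clear()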

Since your code works for me, I'd ask you to double-check your environment and try to reproduce again. If it's still happening, I'll need your Python and pytest versions.

HansBambel commented 11 months ago

Thanks for the help.

In a clean environment it does indeed work, also with Python 3.9.13 and pytest 7.1.3 (which I am using in my env).

I guess it must be an interaction with other packages.

Package               Version
--------------------- ---------
anyio                 3.6.1
attrs                 22.1.0
black                 22.8.0
certifi               2022.9.24
cfgv                  3.3.1
charset-normalizer    2.1.1
click                 8.1.3
coverage              6.5.0
distlib               0.3.6
environs              9.5.0
fastapi               0.85.0
filelock              3.8.0
flake8                5.0.4
flake8-builtins       1.5.3
flake8-comprehensions 3.10.0
flake8-mutable        1.2.0
gunicorn              20.1.0
h11                   0.14.0
identify              2.5.6
idna                  3.4
iniconfig             1.1.1
marshmallow           3.18.0
mccabe                0.7.0
mypy                  0.981
mypy-extensions       0.4.3
nodeenv               1.7.0
numpy                 1.23.3
packaging             21.3
pandas                1.5.0
pathspec              0.10.1
pip                   22.2.2
platformdirs          2.5.2
pluggy                1.0.0
pre-commit            2.20.0
py                    1.11.0
pycodestyle           2.9.1
pydantic              1.10.2
pyflakes              2.5.0
pyparsing             3.0.9
pytest                7.1.3
pytest-antilru        1.1.1
pytest-asyncio        0.19.0
python-dateutil       2.8.2
python-dotenv         0.21.0
pytz                  2022.4
PyYAML                6.0
requests              2.28.1
responses             0.22.0
sentry-sdk            1.9.10
setuptools            65.4.1
six                   1.16.0
sniffio               1.3.0
starlette             0.20.4
toml                  0.10.2
tomli                 2.0.1
types-requests        2.28.11.1
types-toml            0.10.8
types-urllib3         1.26.25
typing_extensions     4.3.0
urllib3               1.26.12
uvicorn               0.18.3
virtualenv            20.16.5
wheel                 0.37.1

EDIT: After setting up a new toy environment it seems to work, but still not in my actual development environment. I guess it is something in my code; I have to dig deeper.

ipwnponies commented 11 months ago

@HansBambel can you clarify on your edit? If you rebuild your project from scratch, everything works as expected?

FWIW the only known issues are due to interactions with other pytest packages (e.g. #27), but that's more of a loading-order issue. This package is basically the decorator pattern and gets in early to decorate lru_cache before application code imports it. I'd double-check your timed_lru_cache implementation and make sure it's decorating/wrapping in a similar fashion, and not inadvertently clobbering or shadowing the underlying lru_cache functionality. In fancier words: adhere to the Liskov substitution principle when making a derived cache.
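As a hedged illustration of that substitution point (a hypothetical test, not from the thread), a derived cache decorator can be sanity-checked against the lru_cache interface; a check like this would have caught the missing attributes in the first timed_lru_cache version above:

from utils import rand_num

def test_derived_cache_keeps_lru_interface():
    # A Liskov-friendly derived cache still exposes the lru_cache API.
    assert callable(rand_num.cache_clear)
    assert callable(rand_num.cache_info)
    assert rand_num.cache_info().maxsize == 128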

HansBambel commented 11 months ago

@ipwnponies Sorry for not being clear enough :)

  1. In a separate environment where only pytest + pytest-antilru is installed, the toy example works.
  2. In a separate environment where all of our actual development packages are installed, the toy example works.
  3. In our development environment (even after deleting the env and recreating it as in 2.), the toy example (and the actual functionality) does not work.

In 3., there is other code besides the toy example, and I believe that code is the culprit.

I will have a look at the loading order and also test whether it works with a plain lru_cache (haven't tried that one yet 😅) on Monday.

HansBambel commented 11 months ago

It works with a plain lru_cache, so it has to do with the timed_lru_cache implementation.

I'm not sure how to test the loading order though. Thanks for the help anyway!