long2ice / fastapi-cache

fastapi-cache is a tool to cache FastAPI responses and function results, with support for redis and memcached backends.
https://github.com/long2ice/fastapi-cache
Apache License 2.0

redis produces multiple keys for the same request because args/kwargs aren't encoded #26

Open dveleztx opened 3 years ago

dveleztx commented 3 years ago

[Screenshot: Redis Multiple Keys]

As seen in the screenshot, when the same endpoint is hit again before the cache expires, nothing is retrieved from redis; a new key is created instead.
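For context, a minimal standalone sketch (not from the issue) of why unencoded kwargs defeat the cache: if the key is built from the string form of the arguments, any per-request object's repr embeds its memory address, so every request hashes differently:

```python
import hashlib

class Session:
    pass  # stand-in for a per-request object such as a DB session

def naive_key(func_name: str, kwargs: dict) -> str:
    # hashing the str() of kwargs: object reprs include memory addresses
    return hashlib.md5(f"{func_name}:{kwargs}".encode()).hexdigest()

# two identical "requests", each carrying its own session object
s1, s2 = Session(), Session()
k1 = naive_key("list_items", {"db": s1, "skip": 5, "limit": 3})
k2 = naive_key("list_items", {"db": s2, "skip": 5, "limit": 3})
print(k1 == k2)  # False: the session reprs differ, so redis sees two keys
```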

falled10 commented 3 years ago

@long2ice can you check this one, please? I have the same issue

dveleztx commented 3 years ago

@long2ice Any progress on this? As stated, caching isn't actually working as shown above.

dveleztx commented 3 years ago

@long2ice Status?

k1dav commented 3 years ago

@dveleztx I think you can write your own key_builder to deal with the issue. I also tested your code in some situations that raise errors, e.g. when kwargs contains data that can't be encoded (maybe a pydantic model).

aimfeld commented 2 years ago

@dveleztx I had the same problem; in my case the differing cache keys were caused by enum parameters and a DB session parameter. I wrote a custom key builder to fix it:

import hashlib
from enum import Enum
from typing import Optional

from fastapi import Request, Response
from fastapi_cache import FastAPICache


def api_key_builder(
        func,
        namespace: Optional[str] = "",
        request: Optional[Request] = None,
        response: Optional[Response] = None,
        args: Optional[tuple] = None,
        kwargs: Optional[dict] = None,
):
    """
    Handle Enum and Session params properly.
    """
    prefix = f"{FastAPICache.get_prefix()}:{namespace}:"

    # Remove session and convert Enum parameters to strings
    arguments = {}
    for key, value in kwargs.items():
        if key != 'session':
            arguments[key] = value.value if isinstance(value, Enum) else value

    cache_key = prefix + hashlib.md5(f"{func.__module__}:{func.__name__}:{args}:{arguments}".encode()).hexdigest()

    return cache_key

I then use my custom api_key_builder globally like this:

FastAPICache.init(InMemoryBackend(), prefix='fastapi', key_builder=api_key_builder)
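As a quick sanity check, here is a standalone sketch of the same logic (the prefix is passed in directly instead of coming from FastAPICache.get_prefix(), and FakeSession stands in for a real DB session): the builder yields identical keys across calls even though each call gets a fresh session object and an Enum parameter:

```python
import hashlib
from enum import Enum

class Color(Enum):
    RED = "red"

class FakeSession:
    pass  # stand-in for a DB session

def api_key_builder(func, namespace="", args=None, kwargs=None, prefix="fastapi"):
    # same logic as above: drop the session, normalize Enums to their values
    arguments = {}
    for key, value in (kwargs or {}).items():
        if key != "session":
            arguments[key] = value.value if isinstance(value, Enum) else value
    digest = hashlib.md5(
        f"{func.__module__}:{func.__name__}:{args}:{arguments}".encode()
    ).hexdigest()
    return f"{prefix}:{namespace}:{digest}"

def endpoint():
    pass

k1 = api_key_builder(endpoint, kwargs={"session": FakeSession(), "color": Color.RED, "skip": 5})
k2 = api_key_builder(endpoint, kwargs={"session": FakeSession(), "color": Color.RED, "skip": 5})
print(k1 == k2)  # True: the session is ignored and the Enum is normalized
```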

antont commented 1 year ago

I remove both session and request from the key.

            copy_kwargs = kwargs.copy()
            request: Request = copy_kwargs.pop("request")
            del copy_kwargs["session"]

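Put into a full key builder, that approach might look like this (my own sketch of how the snippet above could fit together; everything except the pop/del logic is assumed):

```python
import hashlib
from typing import Callable, Optional

def no_request_key_builder(
    func: Callable,
    namespace: Optional[str] = "",
    request=None,
    response=None,
    args: Optional[tuple] = None,
    kwargs: Optional[dict] = None,
) -> str:
    # copy so the caller's kwargs are untouched, then drop per-request objects
    copy_kwargs = dict(kwargs or {})
    copy_kwargs.pop("request", None)
    copy_kwargs.pop("session", None)
    return f"{namespace}:" + hashlib.md5(
        f"{func.__module__}:{func.__name__}:{args}:{copy_kwargs}".encode()
    ).hexdigest()

def list_items():
    pass

class Dummy:
    pass

k1 = no_request_key_builder(list_items, kwargs={"request": Dummy(), "session": Dummy(), "skip": 5})
k2 = no_request_key_builder(list_items, kwargs={"request": Dummy(), "session": Dummy(), "skip": 5})
print(k1 == k2)  # True: request and session no longer affect the key
```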
sebaxtian commented 1 year ago

Thank you @aimfeld I solved the issue.

I have an API that uses an SQLite DB, that issue is due to SQLAlchemy generating a different Session object, for instance:

kwargs.items(): dict_items([('db', <sqlalchemy.orm.session.Session object at 0x7f92ad2f1120>), ('skip', 5), ('limit', 3)])
<sqlalchemy.orm.session.Session object at 0x7f92ad3b2ec0>
kwargs.items(): dict_items([('db', <sqlalchemy.orm.session.Session object at 0x7f92ad3b3940>), ('skip', 5), ('limit', 3)])
<sqlalchemy.orm.session.Session object at 0x7f92ad3b3940>

@dveleztx I solved the issue as you mentioned:

import hashlib
from typing import Callable, Optional

from fastapi import Request, Response


def api_key_builder(
    func: Callable,
    namespace: Optional[str] = "",
    request: Optional[Request] = None,
    response: Optional[Response] = None,
    args: Optional[tuple] = None,
    kwargs: Optional[dict] = None,
) -> str:
    from fastapi_cache import FastAPICache

    # SOLUTION: https://github.com/long2ice/fastapi-cache/issues/26
    #print("kwargs.items():", kwargs.items())
    arguments = {}
    for key, value in kwargs.items():
        if key != 'db':
            arguments[key] = value
    #print("request:", request, "request.base_url:", request.base_url, "request.url:", request.url)
    arguments['url'] = request.url
    #print("arguments:", arguments)

    prefix = f"{FastAPICache.get_prefix()}:{namespace}:"
    cache_key = (
        prefix
        + hashlib.md5(  # nosec:B303
            f"{func.__module__}:{func.__name__}:{args}:{arguments}".encode()
        ).hexdigest()
    )
    return cache_key
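One thing this version adds over the earlier builder is mixing request.url into the hashed arguments. A standalone sketch (with a stand-in request object, since there is no real starlette Request here) shows that two endpoints with identical kwargs now get distinct keys:

```python
import hashlib

class FakeRequest:
    def __init__(self, url: str):
        self.url = url  # stand-in for Request.url

def url_key(func_name: str, kwargs: dict, request: FakeRequest) -> str:
    # mirror the builder above: drop 'db', mix the URL into the hash
    arguments = {k: v for k, v in kwargs.items() if k != "db"}
    arguments["url"] = request.url
    return hashlib.md5(f"{func_name}:{arguments}".encode()).hexdigest()

common = {"skip": 5, "limit": 3}
k1 = url_key("handler", common, FakeRequest("http://api/items?skip=5&limit=3"))
k2 = url_key("handler", common, FakeRequest("http://api/users?skip=5&limit=3"))
print(k1 == k2)  # False: different URLs hash to different cache keys
```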

Then use the new key builder on your FastAPICache init:

redis = aioredis.from_url(get_settings().url_redis)
FastAPICache.init(RedisBackend(redis), prefix="tangara-cache", key_builder=api_key_builder)

Thank you so much. Maybe someone still having the same or a similar issue with PostgreSQL or any other DB will find this solution helpful.
