litestar-org / litestar

Production-ready, Light, Flexible and Extensible ASGI API framework | Effortlessly Build Performant APIs
https://litestar.dev/
MIT License

Enhancement: Add (or document) method for caching a route indefinitely #1365

Closed · ubernostrum closed this 1 year ago

ubernostrum commented 1 year ago

Summary

Currently, the low-level cache API permits caching a value indefinitely by passing expiration=None to the cache's set() method. However, the cache argument to route decorators does not accept None (it raises an exception if cache=None is passed), and I can find no other mechanism for specifying that the response from a route should be cached indefinitely.
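For reference, the low-level usage looks roughly like this (the request.app.cache access path is an assumption about the current API; the expiration=None behaviour of set() is as described above):

from starlite import Request, get

@get("/manual")
async def manually_cached_handler(request: Request) -> str:
    value = "expensive result"
    # expiration=None stores the entry with no expiry at the backend level
    await request.app.cache.set("manual-key", value, expiration=None)
    return value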

Basic Example

It would be possible to write:

from starlite import get

@get("/", cache=<sentinel value indicating no expiration>)
async def my_indefinitely_cached_handler() -> str:
    ...

and have the response be cached with no explicit expiration.

Drawbacks and Impact

I'm not sure there would be any negative impact beyond enabling a use case like this; no existing cache functionality would be removed or changed in a backwards-incompatible way, since this is purely additive.

On the positive side, it would become possible to have endpoints that effectively serve out of cache forever.

For a concrete example, I currently have an application that wants to do this: its data set changes rarely and is expensive to query on the fly. I've instrumented the data-update logic to insert the new data set into the database and also to calculate and clear the affected cache key(s); after that, I'd like the endpoints that serve the data to query the DB once and then cache "forever" (or until the next data update clears those keys).

At the moment I'm simulating this as best I can by doing the caching in my DB query layer. If I could reliably work out how to generate the correct Starlite Response objects, I could just pickle them and do a low-level cache set() with the appropriate key and expiration=None, but ideally I would let Starlite itself generate the responses and cache them with indefinite expiration.
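As a sketch of that workaround (the cache object, key name and get_dataset helper are purely illustrative, not Starlite API; this is just the shape of what my query layer does):

import pickle

async def get_dataset(cache, db) -> list:
    cached = await cache.get("dataset")
    if cached is not None:
        return pickle.loads(cached)
    data = await db.fetch_dataset()  # the expensive query
    # expiration=None keeps the entry until the update job deletes it
    await cache.set("dataset", pickle.dumps(data), expiration=None)
    return data

The data-update job then writes the new rows and calls cache.delete("dataset"), so the next request repopulates the entry.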

Unresolved questions

The main question I can see is what the correct value for the cache argument would be. With the low-level set() method (at least on Redis, which is the cache backend I'm using), a key can be set to never expire by passing expiration=None, but passing cache=None in a route decorator feels like it should mean "do not cache at all" rather than "cache indefinitely". So some kind of special sentinel value would probably be needed -- perhaps something like:

from starlite import get
from starlite.cache import NO_EXPIRATION

@get("/", cache=NO_EXPIRATION)
async def my_indefinitely_cached_handler() -> str:
    ...

I believe Redis reports a non-expiring key as having a TTL of -1, so perhaps either that value, or a named constant holding it, would work.
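For illustration, this is the standard Redis behaviour as seen through redis-py: a key set without an expiry never expires, and TTL reports -1 for it:

import redis

r = redis.Redis()
r.set("k", "v")             # no ex/px given, so the key never expires
print(r.ttl("k"))           # -1: the key exists but has no expiry set
r.set("k2", "v", ex=60)
print(r.ttl("k2"))          # remaining time to live in seconds (about 60)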

provinzkraut commented 1 year ago

So, we actually discussed this previously, but to be honest, I can't recall what the verdict was. @Goldziher?

In any case, there should be a way. I remember raising the point that, from a semantic perspective, cache=True should cache indefinitely, but that would then make the default expiration require some sort of sentinel value.
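Roughly (plain Python, not the actual decorator signature), the issue is that "use the configured default expiration" would then need its own sentinel:

DEFAULT_EXPIRATION = object()  # sentinel meaning "use the app-wide default expiration"

def get(path: str, cache=DEFAULT_EXPIRATION):
    # cache=True -> cache indefinitely, cache=<int> -> that many seconds,
    # cache=DEFAULT_EXPIRATION -> fall back to the configured default
    ...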

Personally I don't particularly care for -1, as it's not very explicit. We could, however, add a sentinel value like NO_EXPIRATION which would achieve the same effect.

Alternatively, and as a workaround, you could just pass something like timedelta(days=365).total_seconds() to cache for a year (multiply as your heart desires).
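In decorator form that would look something like this (the int() cast is just in case the parameter expects whole seconds):

from datetime import timedelta

from starlite import get

@get("/", cache=int(timedelta(days=365).total_seconds()))
async def effectively_forever_cached_handler() -> str:
    ...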

Goldziher commented 1 year ago

Sentinel sounds good

provinzkraut commented 1 year ago

Related issue: #1301

ubernostrum commented 1 year ago

Thanks for the quick turnaround. I assume this will ship in 2.0?