A lightweight caching library that leverages FastAPI's middleware functionality and follows Cache-Control best practices to easily speed up your slow or expensive requests.
The package can be added to your project by running:
pip install cache-fastapi
or
poetry add cache-fastapi
To get started, import the cache-fastapi middleware and a backend, then add the middleware to your FastAPI app:
from fastapi import FastAPI
from cache_fastapi.cacheMiddleware import CacheMiddleware
from cache_fastapi.Backends.memory_backend import MemoryBackend
cached_endpoints = [
    "/test"
]
app = FastAPI()
backend = MemoryBackend()
app.add_middleware(CacheMiddleware, cached_endpoints=cached_endpoints, backend=backend)
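To have something to cache, the snippet above still needs a matching route and a server. A minimal sketch follows; the route body is purely illustrative:

```python
# Continues the snippet above: a route matching the "/test" entry
# in cached_endpoints.
@app.get("/test")
def read_test():
    # Stand-in for an expensive operation whose response is worth caching.
    return {"message": "hello from a cached endpoint"}
```

Run the app as usual (for example with uvicorn main:app), and repeated calls to /test within the cache window are served from the backend instead of re-executing the endpoint.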
For nested routes, just add the base route. For example, if you have a route /data/{data_id}, add /data/ to the cached_endpoints list:
cached_endpoints = [
    "/test",
    "/data/"
]
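As a sketch, with a route like the one below (the path parameter and body are illustrative), responses for /data/1, /data/2, and so on are all matched by the /data/ prefix:

```python
# Continues the snippet above.
@app.get("/data/{data_id}")
def read_data(data_id: int):
    # Cached because "/data/" is listed in cached_endpoints.
    return {"data_id": data_id}
```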
The cached_endpoints list can be used to define all the endpoints you want to cache. This gives you a central place where you can keep track of all the cached endpoints.
The default cache max-age is 60 seconds. To override it for a particular request, send the following header with the request:
Cache-Control: max-age=time
Here, time must be given in seconds.
Once a response is cached, you can read the age of the cached response from the Cache-Control response header.
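As an illustration, a client can both request a longer cache lifetime and inspect the returned header. This is a minimal sketch using the requests library; the localhost URL, the /test endpoint, and the 120-second value are placeholders:

```python
import requests

# Ask the middleware to cache this response for 120 seconds
# instead of the default 60.
response = requests.get(
    "http://localhost:8000/test",
    headers={"Cache-Control": "max-age=120"},
)

# On a cache hit, the Cache-Control response header reports the
# age of the cached entry.
print(response.headers.get("Cache-Control"))
```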
To overwrite the cached entry and get a fresh response, use:
Cache-Control: no-cache
If you want to make a request without it being considered for caching at all, use:
Cache-Control: no-store
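Both headers can be exercised from a client in the same way (again a sketch with a placeholder URL and endpoint):

```python
import requests

# no-cache: skip the stored copy and fetch a fresh response,
# which then overwrites the cached entry.
fresh = requests.get(
    "http://localhost:8000/test",
    headers={"Cache-Control": "no-cache"},
)

# no-store: fetch a response without it being considered for caching at all.
uncached = requests.get(
    "http://localhost:8000/test",
    headers={"Cache-Control": "no-store"},
)
```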
The library supports multiple backends for various use cases. The MemoryBackend used above keeps cached responses in process memory; a Redis-backed setup reads its connection URL from the REDIS_URL environment variable.
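As a minimal sketch of a Redis-backed setup, assuming the package exposes a RedisBackend mirroring the memory backend's layout (the import path and constructor below are assumptions; check the package for the exact names):

```python
import os

from fastapi import FastAPI
from cache_fastapi.cacheMiddleware import CacheMiddleware
# Assumed import path, mirroring cache_fastapi.Backends.memory_backend.
from cache_fastapi.Backends.redis_backend import RedisBackend

# The Redis backend is expected to read its connection URL from REDIS_URL.
os.environ.setdefault("REDIS_URL", "redis://localhost:6379/0")

app = FastAPI()
backend = RedisBackend()  # assumed constructor; see the package docs
app.add_middleware(
    CacheMiddleware,
    cached_endpoints=["/data/"],
    backend=backend,
)
```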