Closed: scheung38 closed this issue 1 year ago.
Can you describe more specifically what sort of caching you want to accomplish? (At the very least, do you mean client side or server side?)
Server side specifically, thanks. Where can I find this information? When testing FastAPI vs Flask in Postman, I notice that the initial FastAPI request takes 1.2 sec and subsequent requests take 0.3 sec, but with Flask every request is a constant 1.2 sec. So I assume Flask has no caching by default, whereas FastAPI does?
There is no built-in caching mechanism in FastAPI -- you are responsible for implementing your own caching.
(There is a dependency injection "cache", which prevents dependency injection functions from being executed multiple times per request, but I don't think that is what you are looking for.)
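As a rough illustration of that dependency cache (which affects dependency execution within one request, not response caching), here is a minimal sketch; the dependency and endpoint names are made up for the example, while `use_cache` is the actual parameter on `Depends`:

```python
from fastapi import Depends, FastAPI

app = FastAPI()

def expensive_dependency() -> dict:
    # Pretend this is costly (a DB lookup, reading config, etc.).
    print("expensive_dependency executed")
    return {"value": 42}

@app.get("/items")
def read_items(
    a: dict = Depends(expensive_dependency),
    b: dict = Depends(expensive_dependency),
):
    # Within a single request, FastAPI runs expensive_dependency only once
    # and reuses its result for both parameters (the dependency "cache").
    return {"a": a, "b": b}

@app.get("/items-uncached")
def read_items_uncached(
    a: dict = Depends(expensive_dependency),
    b: dict = Depends(expensive_dependency, use_cache=False),
):
    # use_cache=False forces the dependency to run again for this parameter.
    return {"a": a, "b": b}
```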
If you describe your caching use case you may get some recommendations for approaches.
Just a simple GET request; how do I cache it? Thanks.
functools.lru_cache is a built-in that might be easiest.
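For instance, a minimal sketch caching the result of an expensive helper behind a GET endpoint (the function and route names are hypothetical, and note the cache lives in each worker process's memory and has no expiry):

```python
from functools import lru_cache

from fastapi import FastAPI

app = FastAPI()

@lru_cache(maxsize=128)
def slow_lookup(item_id: int) -> dict:
    # Hypothetical expensive work (DB query, remote API call, heavy computation).
    # lru_cache memoizes the return value in process memory, keyed by item_id.
    return {"item_id": item_id, "detail": "expensive result"}

@app.get("/items/{item_id}")
def read_item(item_id: int):
    # Repeat requests for the same item_id are served from the in-memory cache
    # without re-running slow_lookup.
    return slow_lookup(item_id)
```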
^ that's definitely the simplest solution for an in-memory cache. If you are running nginx in front of FastAPI, it has some built-in caching functionality: https://docs.nginx.com/nginx/admin-guide/content-cache/content-caching/#enable. A sketch of what that could look like is below.
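Something along these lines should work as a starting point (the cache path, zone name, and upstream address are assumptions, not a drop-in config):

```nginx
# Hypothetical nginx content cache in front of a FastAPI app on 127.0.0.1:8000.
proxy_cache_path /var/cache/nginx keys_zone=api_cache:10m max_size=100m inactive=10m;

server {
    listen 80;

    location / {
        proxy_cache api_cache;           # use the zone defined above
        proxy_cache_valid 200 1m;        # cache successful responses for one minute
        proxy_pass http://127.0.0.1:8000;
    }
}
```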
Hope that helps.
Speaking only about the initial request, if by any chance you are hitting the /openapi.json
endpoint, that is created and reused (cached) on the first request to it.
Apart from that, there's no other caching.
Assuming the original issue was solved, it will be automatically closed now. But feel free to add more comments or create new issues.
Description
How can I enable caching if not enabled by default when making API requests?