fastapi / fastapi

FastAPI framework, high performance, easy to learn, fast to code, ready for production
https://fastapi.tiangolo.com/
MIT License

Basic performance enhancement #559

Closed scheung38 closed 1 year ago

scheung38 commented 5 years ago

Description

How can I enable caching if not enabled by default when making API requests?

dmontagu commented 5 years ago

Can you describe more specifically what sort of caching you want to accomplish? (At the very least, do you mean client side or server side?)

scheung38 commented 5 years ago

Server side specifically, thanks. Where can I find this information? When testing FastAPI vs Flask in Postman, I noticed that the initial FastAPI request takes 1.2 s and subsequent requests take 0.3 s, but with Flask every request is a constant 1.2 s. So I assume Flask has no caching by default, whereas FastAPI does?

dmontagu commented 5 years ago

There is no built-in caching mechanism in FastAPI; you are responsible for implementing your own caching.

(There is a dependency injection "cache", which prevents dependency injection functions from being executed multiple times per request, but I don't think that is what you are looking for.)
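For reference, a minimal sketch of that per-request dependency cache (the `get_settings` dependency and `/status` route here are made up for illustration; the behavior comes from the `use_cache=True` default on `Depends`):

```python
from fastapi import Depends, FastAPI

app = FastAPI()

def get_settings() -> dict:
    # Imagine this loads config from disk or an external service.
    return {"env": "dev"}

@app.get("/status")
def read_status(
    a: dict = Depends(get_settings),
    b: dict = Depends(get_settings),
):
    # With the default use_cache=True, get_settings ran only once for
    # this request, so both parameters received the same object.
    return {"same_object": a is b}
```

Passing `Depends(get_settings, use_cache=False)` would make FastAPI call the dependency again for each parameter instead of reusing the first result.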

If you describe your caching use case you may get some recommendations for approaches.

scheung38 commented 5 years ago

Just a simple GET request; how do I cache it? Thanks.

dmontagu commented 5 years ago

functools.lru_cache is a built-in that might be easiest.
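Something like this minimal sketch (`expensive_lookup` is a hypothetical stand-in for whatever slow work your GET handler does):

```python
from functools import lru_cache

from fastapi import FastAPI

app = FastAPI()

@lru_cache(maxsize=128)
def expensive_lookup(item_id: int) -> dict:
    # Stand-in for a slow computation or upstream call; the result
    # for each item_id is computed once, then served from memory.
    return {"item_id": item_id, "value": item_id * 2}

@app.get("/items/{item_id}")
def read_item(item_id: int):
    return expensive_lookup(item_id)
```

One caveat: don't put `lru_cache` directly on an `async def` endpoint, since it would cache the coroutine object itself, which can only be awaited once. The cache is also in-memory and per-process, so it isn't shared across workers.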

alimcmaster1 commented 5 years ago

^ That's definitely the simplest solution for an in-memory cache. If you are running nginx in front of FastAPI, it has some built-in caching functionality: https://docs.nginx.com/nginx/admin-guide/content-cache/content-caching/#enable. A rough sketch is below.
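Something along these lines (a sketch, not a drop-in config; the cache path, zone name, TTL, and upstream address are all assumptions):

```nginx
# proxy_cache_path belongs at the http{} level of nginx.conf.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=api_cache:10m
                 max_size=100m inactive=60m;

server {
    listen 80;

    location / {
        proxy_cache       api_cache;               # use the zone declared above
        proxy_cache_valid 200 10m;                 # cache successful responses for 10 minutes
        proxy_pass        http://127.0.0.1:8000;   # uvicorn/FastAPI app, assumed address
    }
}
```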

Hope that helps.

tiangolo commented 5 years ago

Speaking only about the initial request: if by any chance you are hitting the /openapi.json endpoint, that schema is generated on the first request and then reused (cached) for subsequent ones.

Apart from that, there's no other caching.

github-actions[bot] commented 4 years ago

Assuming the original issue was solved, it will be automatically closed now. But feel free to add more comments or create new issues.